
Technology and computers and how good everything was in the old days

Not so in software – the gravy train is still going.

I think yes, in server-side code, and client-side (JS mainly I think). I have had a bit of exposure to this, thankfully without having to do any of it. Due to hacking, you have to be either damn good at it, or avoid it totally. I know someone in his late 50s who has spent many years in this area and is totally sick of it, having to read yet another 1000 pages on yet another “paradigm”. In fact I can’t think of anybody who enjoys it, unless they are pretty young. And I think project management in this area works only if you don’t hang around the same company for too long. The complexity is massive and so much broad expertise is needed.

But embedded stuff opportunities have definitely shrunk. The B2B electronics scene, loosely described as industrial control / monitoring, is dominated by big names. Up to maybe 20 years ago a company manufacturing e.g. cakes would have a clever guy there running their gear. He then retired, could not be replaced, and the company went to e.g. Siemens for a turnkey solution. They pay a lot more for stuff which does the same job but they are happy, and all the gear comes in a matching colour scheme.

There are still opportunities but they are much less obvious.

Today’s hardware is great though. I am right now writing loads of C on an ARM 32F417 which runs at 168MHz. It is so fast that everything just happens much faster than it needs to. Reams of code run in a few microseconds, which is just as well since the example code which ST give you is bloated and buggy crap. Everybody knows it is crap, and everybody has spent months fixing the same bugs while trawling the internet for dev forums where people report the same problems and, very occasionally, post how they fixed something. The hardware I am working on will be used for both company products and some hobby projects of mine, and it will do absolutely everything I will ever need to build. This chip has enough performance to implement something like a GTN750, KFC225, and a few other bits, all at the same time. On the minus side, the programming reference manual is 2000 pages. The hardware reference is 200 pages and I read every word of it when doing the PCB design.
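Just to put a number on “a few microseconds”: here is a minimal sketch (not ST’s code; the register addresses are simply the standard ARMv7-M debug ones and the function names are my own) of timing a block of code with the Cortex-M4 cycle counter, assuming the 168MHz clock mentioned above:

```c
/* Minimal cycle-count timing sketch for a Cortex-M4 (e.g. a 32F417 at 168MHz).
 * Uses the standard ARMv7-M DWT cycle counter; this is an illustration, not ST library code.
 */
#include <stdint.h>

#define DEMCR      (*(volatile uint32_t *)0xE000EDFCu)  /* Debug Exception & Monitor Control */
#define DWT_CTRL   (*(volatile uint32_t *)0xE0001000u)
#define DWT_CYCCNT (*(volatile uint32_t *)0xE0001004u)

#define CPU_HZ 168000000u   /* core clock assumed from the post above */

static void cyccnt_init(void)
{
    DEMCR      |= (1u << 24);   /* TRCENA: enable the DWT/ITM block   */
    DWT_CYCCNT  = 0;            /* reset the counter                  */
    DWT_CTRL   |= 1u;           /* CYCCNTENA: start counting cycles   */
}

/* Time an arbitrary function; returns elapsed microseconds. */
static uint32_t time_us(void (*fn)(void))
{
    uint32_t start = DWT_CYCCNT;
    fn();
    uint32_t cycles = DWT_CYCCNT - start;   /* unsigned subtraction handles wrap */
    return cycles / (CPU_HZ / 1000000u);
}
```

At 168 cycles per microsecond, even a few thousand instructions come and go in well under 20µs, which is the sort of margin being described here.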

Administrator
Shoreham EGKA, United Kingdom

Peter wrote:

In fact I can’t think of anybody who enjoys it, unless they are pretty young

I don’t know if you consider 48 “pretty young” but I enjoy it. I like learning new things (and software development is a bad career choice for people who don’t like learning new things).

As with everything you have to be able to avoid the bullshit. There are a lot of new fashionable things (particularly in front end (JS) development), and not all of them are good. Frameworks often seem like brilliant time-savers until you realise they are a straitjacket that only saves time if you never deviate in the slightest from the framework developer’s vision. However, there is good stuff around that isn’t a straitjacket if you’re prepared to look around a bit.

The foundations last for years, though.

Andreas IOM

Peter wrote:

Compuserve indeed had offline reading and it was necessary because you were paying heavily for each second you were connected. However, the software got more and more unwieldy and the last edition of it was so complicated that you were reading huge long discussions about how to use it! In the end people got tired of it, and I recall that was about the end of CS anyway.

CIM was not really an offline reader but simply a platform which allowed you to “see” the contents of CIS in a better way. It did not save you much money. However, there were quite a few offline readers. I started out with DOSCIM and WINCIM but then transferred to Virtual Access, which was an Outlook-like reader that held your mail and fora, not only Compuserve but also internet fora supporting the NNTP protocol. For Compuserve, TAPCIS was quite popular too; I never used it though, as VA also had all the sysop tools TAPCIS had.

What I liked about this is that you had all the content you ever produced, as well as the full forum threads etc., locally available on your hard drive and, in the case of VA, you could really easily keep it on backup disks or whatever. I recently restarted my old VA installation as I wanted to look something up, and it starts even off a memory stick; obviously it cannot connect any more, but basically all my discussions and emails from the 1990s are still there. Including backups, the whole thing is less than a few GB.

I must say that if it was possible to still get that functionality onto places like EuroGA (NNTP protocol) I’d be back at VA in no time at all. Not because it saves money but because it is an excellent way to keep the stuff for future reference. For example, I lost all my original Flightforum posts in an upgrade a few years back, which is something I am still quite unhappy about, because it would be great to have access to this stuff if you need to look up some things.

LSZH(work) LSZF (GA base), Switzerland

I used TAPCIS too. The final version of it became so bloated nobody could understand it.

The idea was similar to Usenet in that you “subscribed” to specific groups, went online to grab new posts, went offline, read them / responded, and went back online to upload your responses.

Usenet was much better than CS because you could be anonymous, so one learnt a lot more than one ever learns in forums where full real names are enforced. I learnt loads from rec.aviation.* – way more than from any formal training. But Usenet was free, so most (not all) of it was eventually killed by spam. And then c. 2000 people moved to web-based forums which simply worked better, supported pic upload, etc.

Technically one could have NNTP on EuroGA but I suspect the number of users could be counted on the fingers of one hand and without resorting to binary
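For anyone curious what the “subscribe / go online / grab new posts / go offline” cycle actually looks like on the wire, here is a rough sketch of the fetch half in C. It speaks plain NNTP on port 119; the server name, group and article number are made up, and a real reader would parse the reply codes properly, remember the last article it saw, and use the POST command to upload replies composed offline:

```c
/* Rough sketch of the "go online and grab new posts" half of an offline reader.
 * Plain NNTP over TCP; no TLS, minimal error handling. Host/group/article are placeholders.
 */
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <netdb.h>
#include <sys/socket.h>

static int nntp_connect(const char *host)
{
    struct addrinfo hints = {0}, *res;
    hints.ai_socktype = SOCK_STREAM;
    if (getaddrinfo(host, "119", &hints, &res) != 0)
        return -1;
    int fd = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
    if (fd >= 0 && connect(fd, res->ai_addr, res->ai_addrlen) != 0) {
        close(fd);
        fd = -1;
    }
    freeaddrinfo(res);
    return fd;
}

static void send_cmd(int fd, const char *cmd)
{
    write(fd, cmd, strlen(cmd));
    write(fd, "\r\n", 2);
}

int main(void)
{
    char buf[4096];
    int fd = nntp_connect("news.example.org");   /* placeholder server        */
    if (fd < 0) return 1;

    read(fd, buf, sizeof buf);                   /* "200 ..." greeting        */

    send_cmd(fd, "GROUP rec.aviation.ifr");      /* select a subscribed group */
    read(fd, buf, sizeof buf);                   /* "211 count first last grp" */

    /* A real reader parses first/last article numbers from the 211 reply and
     * fetches only those it has not seen before: */
    send_cmd(fd, "ARTICLE 12345");               /* article number is made up  */
    ssize_t n;
    while ((n = read(fd, buf, sizeof buf - 1)) > 0) {
        buf[n] = '\0';
        fputs(buf, stdout);                      /* store locally, read offline */
        if (strstr(buf, "\r\n.\r\n")) break;     /* multi-line reply ends with "." */
    }

    /* Replies composed offline go back up the same way, via the POST command. */
    send_cmd(fd, "QUIT");
    close(fd);
    return 0;
}
```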

Re frameworks, sure they save time if you use them out of the box, but one can never quite tell which ones will be abandoned in a few years’ time, or what bugs / vulnerabilities there are among the thousands of lines of code. And it is 99% likely the original programmer will be long gone when the chickens come home to roost, and every programmer really hates having to get into code written by somebody else. That is the chief challenge of software projects in any “serious” environment, hence my comment about the manager moving on regularly (it avoids him having to face the music – pardon all the idioms). The EuroGA Airport Database was specced to avoid frameworks for this reason (because I will never be able to escape), although I wish we had used Dropzone for the picture upload, as it would likely have avoided some of the strange issues with macOS and version-specific Safari behaviour (IIRC).

Administrator
Shoreham EGKA, United Kingdom

martin-esmi wrote:

Regarding GSM

It’s odd what survives and what doesn’t. I remember 3G was hot. In the mid 2000s I went to a conference in Japan. At that time I had a new 3G phone from Ericsson. I was the only one in my party who had a workable phone. The others had 2G (GSM), and Japan only had 3G.

Today, though, all of 3G is already gone in Norway; I think the last masts were closed in January. But GSM is alive, with no plans to close that down. We can use the first Nokias from 1991 (and many still do), but not 3G-only phones from the mid 2000s (if any such phones ever existed in Europe). Even the NMT 450 MHz network is still up (sort of), but it’s used for 4G data transmissions by ice.net. They have specialised in wireless networks at remote locations, and at sea in particular. Unmatchable coverage, also perfect for GA.

The elephant is the circulation
ENVA ENOP ENMO, Norway

2G is presumably GPRS. And EDGE was double-speed GPRS.

But there were implementation differences e.g. GPRS in Europe ran at ~20kbps while in the US I was getting 100+kbps on EDGE (Arizona, 2006).

I reckon 90% of Europe’s land area has no 3G/4G, so 5G is a bit of a laugh. What will 5G actually do?

Aren’t all voice calls (and SMS messages) routed via GSM? I believe they can go via 4G but I don’t think they do. I don’t mean VoIP…

Administrator
Shoreham EGKA, United Kingdom

Peter wrote:

The idea was similar to Usenet in that you “subscribed” to specific groups, went online to grab new posts, went offline, read them / responded, and went back online to upload your responses.

That was exactly how VA worked.

Peter wrote:

Technically one could have NNTP on EuroGA but I suspect the number of users could be counted on the fingers of one hand and without resorting to binary

LOL, well, as I said, I’d be certain to use it.

I never used TAPCIS as I went directly from WINCIM to VA, which had the same features, only they never got in the way of doing the normal stuff.

LSZH(work) LSZF (GA base), Switzerland

Peter wrote:

And EDGE was double speed GPRS.

EDGE was the first time I got online TV, on my then Nokia N70.

LSZH(work) LSZF (GA base), Switzerland

LeSving wrote:

It’s odd what survives and what doesn’t. I remember 3G was hot.

I remember when an older engineering colleague was buying Qualcomm stock and explaining that he thought it was quite a good idea. He later retired on that Qualcomm investment in his 50s, attended law school to keep busy and then set up an office doing wills and trusts… He died (of cancer) at a relatively young-ish age one December, and wrote his own obituary as well as his own will! He was always ahead of the game and I wish I’d followed his idea on Qualcomm stock, as I’m still here, but nothing lasts forever except maybe 50-year-old light aircraft.

With that in mind, and to the extent reasonably possible, I maintain my hangar and contents as a software free zone.

Last Edited by Silvaire at 05 Mar 16:30

Speaking of Nokia leadership, I used to have the wonderful Nokia 808 phone, which produced amazing quality photos, as well as crashing if HSPA (a faster version of 3G; it never did 4G) was enabled. It used to crash in specific locations, one of which was a road junction on the way to Biggin Hill airport.

Reading between the lines of how some people fixed this, it was a specific (and extremely rare) data packet which created a brief (microseconds) higher current draw, which dipped the 3.3V (or whatever) supply rail below spec, crashing the CPU. The fix was in the form of a capacitor, of course…
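For scale, a back-of-envelope sketch with assumed numbers (not the 808’s actual figures): holding the rail up against an extra 1A of draw for 10µs with no more than a 0.1V dip needs roughly C = I × Δt / ΔV = (1A × 10µs) / 0.1V = 100µF of local capacitance, which is why a single well-placed capacitor can be the whole fix.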

3G was fine and HSPA wasn’t needed, and I got rid of the phone in 2015 wholly because Symbian would not run a browser which worked on a lot of websites. Of course my S10e does way more and does it better, and runs Telegram and WhatsApp, etc, but the 808 did much better photos!

Administrator
Shoreham EGKA, United Kingdom