
Programming languages

A compiler can never outperform a human who knows what he is doing. Very esoteric optimisations excepted, but those often bite you in the bum.

One of my school projects was to write one. While you could argue that a human who understands the hardware better can cough up assembly binaries that outperform compiled source code, 1/ that will only work on specific machines and platforms, 2/ it’s not scalable, and 3/ it’s not re-usable.

You won’t go very far in business without 1 & 2 & 3, so you have to give up something… unless you are building embedded systems for the military or algo trading for hedge funds.

Last Edited by Ibra at 28 Jun 08:08
Paris/Essex, France/UK, United Kingdom

Well, Haskell has been around since 1990, so I would say it’s quite firmly established and no butterfly. It has a very precise language definition, last revised in 2010. Certainly there are lots of experimental languages out there, but Haskell is not one of them. Lisp (which is also one of your silly academic esoterica) was designed in 1958.

I never said either of them was a butterfly. I stand by my statement that they are academic exotica (OK, I’ll drop silly). I’ve written programs in Lisp, a long while ago. I still contend that taking a typical good C/Python/Javascript programmer and expecting them to even understand either of those languages, never mind write code in them, is very unlikely to work well.

Butterflies are a different story. Look at Go. 10 years ago it was going to change the world yadda yadda yadda. Now I understand that even Google, its originator, has moved away from it.

LFMD, France

johnh wrote:

I still contend that taking a typical good C/Python/Javascript programmer and expecting them to even understand either of those languages, never mind write code in them, is very unlikely to work well.

My university (Uppsala) uses Haskell as the first language for Computer Science majors. That language is also used in the first data structures and algorithms course. Students have no difficulty understanding it. They also have no problem finding jobs – usually before they even graduate. (Yes, they later also study imperative and object-oriented programming extensively.)

Why do we do this? Because it allows students new to programming to focus on the algorithmic aspects of programming. This practice is not at all unusual at top universities. E.g. Oxford University also uses Haskell as its first language for CS majors.

Last Edited by Airborne_Again at 28 Jun 09:16
ESKC (Uppsala/Sundbro), Sweden

Everything you need to know is in the following. (Sorry for the long post, but it seems to have disappeared from mainstream websites).

A Brief, Incomplete, and Mostly Wrong History of Programming Languages

1801 – Joseph Marie Jacquard uses punch cards to instruct a loom to weave “hello, world” into a tapestry. Redditers of the time are not impressed due to the lack of tail call recursion, concurrency, or proper capitalization.

1842 – Ada Lovelace writes the first program. She is hampered in her efforts by the minor inconvenience that she doesn’t have any actual computers to run her code. Enterprise architects will later relearn her techniques in order to program in UML.

1936 – Alan Turing invents every programming language that will ever be but is shanghaied by British Intelligence to be 007 before he can patent them.

1936 – Alonzo Church also invents every language that will ever be but does it better. His lambda calculus is ignored because it is insufficiently C-like. This criticism occurs in spite of the fact that C has not yet been invented.

1940s – Various “computers” are “programmed” using direct wiring and switches. Engineers do this in order to avoid the tabs vs spaces debate.

1957 – John Backus and IBM create FORTRAN. There’s nothing funny about IBM or FORTRAN. It is a syntax error to write FORTRAN while not wearing a blue tie.

1958 – John McCarthy and Paul Graham invent LISP. Due to high costs caused by a post-war depletion of the strategic parentheses reserve LISP never becomes popular[1]. In spite of its lack of popularity, LISP (now “Lisp” or sometimes “Arc”) remains an influential language in “key algorithmic techniques such as recursion and condescension”[2].

1959 – After losing a bet with L. Ron Hubbard, Grace Hopper and several other sadists invent the Capitalization Of Boilerplate Oriented Language (COBOL). Years later, in a misguided and sexist retaliation against Adm. Hopper’s COBOL work, Ruby conferences frequently feature misogynistic material.

1964 – John Kemeny and Thomas Kurtz create BASIC, an unstructured programming language for non-computer scientists.

1965 – Kemeny and Kurtz go to 1964.

1970 – Guy Steele and Gerald Sussman create Scheme. Their work leads to a series of “Lambda the Ultimate” papers culminating in “Lambda the Ultimate Kitchen Utensil.” This paper becomes the basis for a long running, but ultimately unsuccessful run of late night infomercials. Lambdas are relegated to relative obscurity until Java makes them popular by not having them.

1970 – Niklaus Wirth creates Pascal, a procedural language. Critics immediately denounce Pascal because it uses “x := x + y” syntax instead of the more familiar C-like “x = x + y”. This criticism happens in spite of the fact that C has not yet been invented.

1972 – Dennis Ritchie invents a powerful gun that shoots both forward and backward simultaneously. Not satisfied with the number of deaths and permanent maimings from that invention he invents C and Unix.

1972 – Alain Colmerauer designs the logic language Prolog. His goal is to create a language with the intelligence of a two year old. He proves he has reached his goal by showing a Prolog session that says “No.” to every query.

1973 – Robin Milner creates ML, a language based on the M&M type theory. ML begets SML which has a formally specified semantics. When asked for a formal semantics of the formal semantics Milner’s head explodes. Other well known languages in the ML family include OCaml, F#, and Visual Basic.

1980 – Alan Kay creates Smalltalk and invents the term “object oriented.” When asked what that means he replies, “Smalltalk programs are just objects.” When asked what objects are made of he replies, “objects.” When asked again he says “look, it’s all objects all the way down. Until you reach turtles.”

1983 – In honor of Ada Lovelace’s ability to create programs that never ran, Jean Ichbiah and the US Department of Defense create the Ada programming language. In spite of the lack of evidence that any significant Ada program is ever completed historians believe Ada to be a successful public works project that keeps several thousand roving defense contractors out of gangs.

1983 – Bjarne Stroustrup bolts everything he’s ever heard of onto C to create C++. The resulting language is so complex that programs must be sent to the future to be compiled by the Skynet artificial intelligence. Build times suffer. Skynet’s motives for performing the service remain unclear but spokespeople from the future say “there is nothing to be concerned about, baby,” in an Austrian accented monotone. There is some speculation that Skynet is nothing more than a pretentious buffer overrun.

1986 – Brad Cox and Tom Love create Objective-C, announcing “this language has all the memory safety of C combined with all the blazing speed of Smalltalk.” Modern historians suspect the two were dyslexic.

1987 – Larry Wall falls asleep and hits Larry Wall’s forehead on the keyboard. Upon waking Larry Wall decides that the string of characters on Larry Wall’s monitor isn’t random but an example program in a programming language that God wants His prophet, Larry Wall, to design. Perl is born.

1990 – A committee formed by Simon Peyton-Jones, Paul Hudak, Philip Wadler, Ashton Kutcher, and People for the Ethical Treatment of Animals creates Haskell, a pure, non-strict, functional language. Haskell gets some resistance due to the complexity of using monads to control side effects. Wadler tries to appease critics by explaining that “a monad is a monoid in the category of endofunctors, what’s the problem?”

1991 – Dutch programmer Guido van Rossum travels to Argentina for a mysterious operation. He returns with a large cranial scar, invents Python, is declared Dictator for Life by legions of followers, and announces to the world that “There Is Only One Way to Do It.” Poland becomes nervous.

1995 – At a neighborhood Italian restaurant Rasmus Lerdorf realizes that his plate of spaghetti is an excellent model for understanding the World Wide Web and that web applications should mimic their medium. On the back of his napkin he designs Programmable Hyperlinked Pasta (PHP). PHP documentation remains on that napkin to this day.

1995 – Yukihiro “Mad Matz” Matsumoto creates Ruby to avert some vaguely unspecified apocalypse that will leave Australia a desert run by mohawked warriors and Tina Turner. The language is later renamed Ruby on Rails by its real inventor, David Heinemeier Hansson. [The bit about Matsumoto inventing a language called Ruby never happened and better be removed in the next revision of this article – DHH].

1995 – Brendan Eich reads up on every mistake ever made in designing a programming language, invents a few more, and creates LiveScript. Later, in an effort to cash in on the popularity of Java the language is renamed JavaScript. Later still, in an effort to cash in on the popularity of skin diseases the language is renamed ECMAScript.

1996 – James Gosling invents Java. Java is a relatively verbose, garbage collected, class based, statically typed, single dispatch, object oriented language with single implementation inheritance and multiple interface inheritance. Sun loudly heralds Java’s novelty.

2001 – Anders Hejlsberg invents C#. C# is a relatively verbose, garbage collected, class based, statically typed, single dispatch, object oriented language with single implementation inheritance and multiple interface inheritance. Microsoft loudly heralds C#’s novelty.

2003 – A drunken Martin Odersky sees a Reese’s Peanut Butter Cup ad featuring somebody’s peanut butter getting on somebody else’s chocolate and has an idea. He creates Scala, a language that unifies constructs from both object oriented and functional languages. This pisses off both groups and each promptly declares jihad.

Footnotes
[1] Fortunately for computer science the supply of curly braces and angle brackets remains high.
[2] Catch as catch can – Verity Stob

LFMD, France

Like I said, you can push any language “successfully” within a closed system that is big enough, because you then don’t need voluntary buy-in. Academia is one such, or if you are a company with 101k employees you have an ecosystem where anything is possible. An independent small company has to work to very different criteria…

Like Pascal was pushed heavily in academia despite being almost totally unused (and thus useless) outside it, at the time. In fact Pascal never gained any critical mass outside academia, with the only people using it being graduates from the 1970s (and presumably the 1980s, until the next “teaching language” took over). I know, because being one of that select elite, I developed a user programmable protocol converter which had a built-in Pascal compiler. And a Wordstar compatible text editor running over an ANSI (VT100) compatible terminal.

This thread is cracking me up, mostly because of the number of us pilots who come from a software engineering (or EE with SE experience) background. My airplane partner is also an engineer, which makes me wonder what share of pilots we engineers comprise.

See Occupations. I think @neil, who started that thread 10 years ago, is still around. Yes, there is a dominance of coders among pilots, especially IR holders, and that is completely unsurprising.

My biggest complaint about ARM asm is the fact that it follows ARM architecture development, which is always moving with their lineup. It’s been the case now for 20 years, and is actually accelerating.

Indeed, but it’s funny because ARM is almost never programmed in asm. In fact the arm32 I have been coding, in C, for the last 1-2 years has special support for totally avoiding asm, even for interrupts. The only bits which must be done in asm are stuff like stack manipulation (necessary in an RTOS), or specific delays (hacks) where you want a minimum of say 0.3us and you want to be sure that neither your existing C compiler nor some future version is going to break it. Even the startup can be done in C, although I don’t like to do so, for various reasons e.g. unwanted code removal (that sort of aggressive optimisation is fully allowed) which asm code avoids.
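
To illustrate the “no asm even for interrupts” point: on Cortex-M parts the hardware stacks and unstacks registers according to the C calling convention (AAPCS), so a handler is just an ordinary C function. A minimal sketch; the peripheral name, register address and bit below are hypothetical, not taken from any particular datasheet:

    /* Minimal sketch: on Cortex-M an interrupt handler is a plain C function,
       because the hardware saves/restores the caller-saved registers per the
       AAPCS, so no asm prologue/epilogue is needed.  The register address and
       flag below are hypothetical, purely for illustration. */

    #include <stdint.h>

    #define TIMER_STATUS   (*(volatile uint32_t *)0x40001010u)  /* hypothetical peripheral register */
    #define TIMER_IRQ_FLAG (1u << 0)

    volatile uint32_t tick_count;        /* shared with the main loop */

    void Timer_IRQHandler(void)          /* referenced from the vector table in the startup code */
    {
        TIMER_STATUS = TIMER_IRQ_FLAG;   /* acknowledge the interrupt (write-1-to-clear assumed) */
        tick_count++;                    /* keep the ISR short; defer real work to the main loop */
    }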

IME, asm is almost never helpful to get speed (like it was in the old chips) – not when you have say 15 DMA controllers which can be chained to timers, ADCs, DACs… You can build whole systems which don’t run any code. You can build an arbitrary waveform gen which runs no code – well, just the UI. You can have an ADC sampling at say 1MHz, off a timer trigger, with DMA shoving the stuff into a circular buffer, no code running. This stuff is designed for real time, e.g. sophisticated 3-phase motor control. It’s a bastard to set up, but once it’s going you recycle the code for new projects. The arm32 cases where you tear your hair out over speed are peripheral access, because the peripherals run with a much slower clock and stuff can run unexpectedly slowly. A typical embedded arm32 system will be I/O speed limited, not limited by the speed at which the ~168MHz CPU executes plain old C code. Even a single float multiply takes 1 clock, so you can be incredibly lazy.
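
As a rough sketch of that timer-triggered ADC + circular DMA setup, here is what it can look like with the STM32 HAL, used purely as one concrete example (the hadc1/htim3 handles and their configuration are assumed to come from generated init code which is not shown; details will differ on other parts):

    /* The timer triggers the ADC at the sample rate and DMA moves each result
       into a circular buffer with no code running; the CPU is only interrupted
       at the half-full and full marks (classic double buffering). */

    #include "stm32f4xx_hal.h"          /* assumed device family */

    extern ADC_HandleTypeDef hadc1;     /* ADC: external (timer) trigger, circular DMA - assumed configured elsewhere */
    extern TIM_HandleTypeDef htim3;     /* timer: update rate = sample rate, TRGO on update - assumed configured elsewhere */

    #define BUF_LEN 2048
    static uint16_t adc_buf[BUF_LEN];

    void capture_start(void)
    {
        HAL_ADC_Start_DMA(&hadc1, (uint32_t *)adc_buf, BUF_LEN);  /* arm the DMA first */
        HAL_TIM_Base_Start(&htim3);                               /* then start the trigger source */
    }

    /* The HAL calls these from the DMA interrupt; each half of the buffer is
       processed while the other half is still being filled. */
    void HAL_ADC_ConvHalfCpltCallback(ADC_HandleTypeDef *hadc)
    {
        /* process adc_buf[0 .. BUF_LEN/2 - 1] */
    }

    void HAL_ADC_ConvCpltCallback(ADC_HandleTypeDef *hadc)
    {
        /* process adc_buf[BUF_LEN/2 .. BUF_LEN - 1] */
    }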

I still contend that taking a typical good C/Python/Javascript programmer and expecting them to even understand either of those languages, never mind write code in them, is very unlikely to work well.

Sure. However, even the above is mixing up applications. C is not going to be used client-side, and is almost never used server-side. Embedded is mostly C, except for quick and dirty hobby projects, for which Python is popular. Javascript is mostly client-side but can be server-side too, and is almost never used in embedded. And server coders rarely do embedded, or vice versa. These are different tools for different jobs. The biggest problem today IME is future support; if you code a website in Ruby you will struggle to find anybody who can maintain it 10 years later – unless you pay lots.

Fashions come and go. This is why the brief for the EuroGA airport database was to write it all in PHP, with minimal reliance on currently-fashionable libraries (whoops, I mean “frameworks”). I can get it maintained for $40/hr, and by lots of people (even though the original guy, in Poland, is still around, and doing other stuff for me currently).

Everything you need to know is in the following. (Sorry for the long post, but it seems to have disappeared from mainstream websites).

That’s hilarious and largely accurate

Administrator
Shoreham EGKA, United Kingdom



Andreas IOM

In academia one can use “teaching languages” like, in my univ days, Pascal

The first Saab Gripen flight control system was written in Pascal

I also learned Pascal at the university. Turbo Pascal. Never used it since. I am not a programmer, but I have coded several programs for control systems and simulations. Mostly in LabVIEW, which no “real” programmer even considers a language. Also some Fortran and C a long time ago.

More than 25 years ago I worked closely with a “real” programmer from Texas. Subsea stuff. He used assembler and C (in a mix) and absolutely hated LabVIEW. At that time he complained he couldn’t hire any young programmers because none of them could produce anything without a library of high level classes.

In engineering, C is probably the most used language (embedded stuff and similar). Then there is special purpose stuff, and of course Python and other interpreted stuff, Matlab. Many still use Fortran in numerical simulations; usually it’s faster than C. LabVIEW is compiled, and the execution speed is close to C (a factor of 1.5-2 on average). It’s dataflow/functional with object oriented features. There are lots of discussions on how to classify it. Is it high level? In most aspects yes, but the dataflow paradigm allows it to compile directly to an FPGA, fully parallel. Nobody cares really. What’s important is that it gets the job done, and it’s done fast by very few people.

The elephant is the circulation
ENVA ENOP ENMO, Norway

A job does need to be done, but if you run your own business then you have a further strong interest in maintainability into the future, and you cannot rely on a given employee to be around say a year from now. This is not an issue if you are that employee; in fact perversely your interest is the opposite, and esoteric solutions are all great for earnings. Your #1 job is to put bread on the table at home. The company comes second. Programmers have zero interest in a product running for 10 or 20 years with zero mods (because it was written well to start with) or with minimal mods (which were done effectively because the product was designed to be easy enough for others to get into). That is rubbish for your CV, rubbish for your LinkedIn profile, and there is a risk of the job becoming boring if not enough products are being developed. In addition, almost every programmer hates opening up code written by others, so you struggle with that also.

The bottom line is quite boring:

  • for embedded, C (with an understanding of asm and hardware)
  • for server-side, PHP (unless you have lots of €€€ to pay)
  • for client-side, JS (unless you have lots of €€€ to pay)

The first Saab Gripen flight control system was written in Pascal

I think that if you dig around the right era – basically the 1970s-1980s, when lots of Pascal coders were coming out of univ – and you look in places where a big enough “internal ecosystem” could be enforced, you will find lots of Pascal. I recall a presentation many years ago where it was claimed that Pascal was one of the languages used by Airbus in the FBW system (with different languages used for the parallel paths, to avoid the same bugs being introduced) but I can’t find a reference to it right now. In the 1980s, IAR hoped to capitalise on it but didn’t sell many, and then the language died in the embedded world, returning on the PC platform with Borland and Delphi, which was actually a very good product but was killed off by the might of M$ and VC++.

Administrator
Shoreham EGKA, United Kingdom

IIRC big chunks of Windows 3.x were in Pascal – look at all the PASCAL declarations you had to make in C code so it would use the Pascal calling convention (rather than the C calling convention).
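
For anyone who never saw Win16 code: PASCAL marked a function as using the Pascal calling convention (arguments pushed left to right, callee cleans up the stack), which is what Windows itself expected. An illustrative declaration from memory – the exact types varied between SDK versions:

    /* Win16-style window procedure declaration.  PASCAL expanded to the
       compiler's pascal-calling-convention keyword, and FAR was needed for
       the segmented memory model.  Illustrative, not a verbatim SDK header. */

    #include <windows.h>

    long FAR PASCAL WndProc(HWND hwnd, unsigned msg, WORD wParam, LONG lParam);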

Peter wrote:

Programmers have zero interest in a product running for 10 or 20 years with zero mods (because it was written well to start with)

Actually I do, because I’m fundamentally extremely lazy, and I don’t want to be fiddling around with yesterday’s news when there’s something more interesting to do today.

Andreas IOM

A job does need to be done, but if you run your own business then you have a further strong interest in maintainability into the future, and you cannot rely on a given employee to be around say a year from now.

Tell that to oil and energy companies… The job definitely needs to be done – by someone.

Typically in software written for hardware (embedded), it’s the availability and maintainability of the hardware that causes problems in the long run, not the software. You either have to make the hardware yourself, or use some economically stable supplier, like Siemens etc. Software can always be fixed somehow, but not a 30 year old circuit made by a long gone company, using long gone chips. It’s a real problem in the energy sector, where guarantees are made for 30 years of operation.

I don’t think there is one good solution to that problem. What is done is to spread the risk, using different systems. Also, those companies (in oil/energy, and the military too) have a good grip on the suppliers, and they pay well and purchase in large quantities.

In “technology”, internet etc, it seems to me it’s all driven by “fashion”. This year Java is “in”, the next year C++ is back, and so on. It doesn’t seem to be very result oriented. But I guess there is tons and tons of “low level” programming in my phone which is nothing but result oriented, and I would think C and/or assembler is the only way to do it, as it has been for the last 40 years or more.

The elephant is the circulation
ENVA ENOP ENMO, Norway