What is the greatest design flaw you have faced in any programming language?


All programming languages have their design flaws, simply because no single language can be perfect, just as with most (all?) other things. That aside, which design flaw in a programming language has annoyed you the most in your history as a programmer?

Note that a language being "bad" simply because it isn't designed for a specific purpose is not a design flaw but a design decision, so don't list such annoyances. If a language is ill-suited for what it is designed for, that is of course a flaw in the design. Implementation-specific and under-the-hood issues do not count either.



- Archives

The use of desktop-inspired forms within asp.net.

It always felt like a fudge and got in the way of how the web actually works. Thankfully asp.net-mvc does not suffer in the same way, though credit goes to Ruby etc. for that inspiration.



- Archives

For me it is PHP's absolute lack of naming and argument ordering conventions in its standard library.

Though JASS's requirement that you nullify references after the referenced object was released/removed (otherwise the reference would leak and several bytes of memory would be lost) is more serious; but since JASS is a single-purpose language, it is not that critical.



- Archives

For me, it is the design problem that plagues all of the languages that were derived from C; namely, the "dangling else." This grammatical problem should have been resolved in C++, but it was carried forth into Java and C#.
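
For anyone who hasn't run into it, the ambiguity is about which if an else binds to. A minimal C++ sketch (the variable names are mine):

#include <iostream>

int main() {
    int a = 0, b = 0;
    // The "else" below is indented as if it pairs with "if (a)",
    // but the grammar attaches it to the nearest "if", i.e. "if (b)".
    if (a)
        if (b)
            std::cout << "a and b\n";
    else
        std::cout << "not a?\n"; // actually runs only when a is true and b is false
}

With a = 0 this prints nothing at all, which surprises anyone reading the indentation. Braces resolve the ambiguity, which is why many style guides mandate them.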



- Archives

I find JavaScript's defaulting to the global scope a major issue: assign to a variable you forgot to declare with var and you have silently created a global. It is often a source of bugs if you don't use JSLint or the like.



- Archives
  1. signed chars in C - an abomination invented to let mathematicians have big collections of small items
  2. Using case to carry semantic content - again for mathematicians, who don't need to talk, and never have enough space for their formulas
  3. Let/Plain assignment vs. Set assignment in Basic dialects - no mathematicians involved here I think


- Archives

I know Perl best, so I'll pick on it.

Perl tried many ideas. Some were good. Some were bad. Some were original and not widely copied for good reason.

One is the idea of context - every function call takes place in list or scalar context, and can do entirely different things in each context. As I pointed out at http://use.perl.org/~btilly/journal/36756 this complicates every API, and frequently leads to subtle design issues in Perl code.

The next is the idea of tying syntax and data types together so completely. This led to the invention of tie to allow objects to masquerade as other data types. (You can also achieve the same effect using overload, but tie is the more common approach in Perl.)

Another common mistake, made by many languages, is to start off by offering dynamic scoping rather than lexical. It is hard to revert this design decision later, and leads to long-lasting warts. The classic description of those warts in Perl is http://perl.plover.com/FAQs/Namespaces.html. Note that this was written before Perl added our variables and static variables.

People legitimately disagree on static versus dynamic typing. I personally like dynamic typing. However it is important to have enough structure that typos can be caught. Perl 5 does a good job of this with strict. But Perl 1-4 got this wrong. Several other languages have lint checkers that do the same thing as strict. As long as you are good about enforcing lint checking, that is acceptable.

If you're looking for more bad ideas (lots of them), learn PHP and study its history. My favorite past mistake (long ago fixed because it led to so many security holes) was defaulting to allowing anyone to set any variable by passing in form parameters. But that is far from the only mistake.



- Archives

C and C++: All those integer types that don't mean anything.

Especially char. Is it text or is it a tiny integer? If it's text, is it an "ANSI" character or a UTF-8 code unit? If it's an integer, is it signed or unsigned?

int was intended to be the "native"-sized integer, but on 64-bit systems, it isn't.

long may or may not be larger than int. It may or may not be the size of a pointer. It's pretty much an arbitrary decision on the part of the compiler writers whether it's 32-bit or 64-bit.

Definitely a language of the 1970s. Before Unicode. Before 64-bit computers.
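
A quick sketch of the ambiguity; the sizes shown in the comments are typical, not guaranteed:

#include <iostream>

int main() {
    // LP64 (typical 64-bit Linux/macOS) prints 4 8 8;
    // LLP64 (64-bit Windows) prints 4 4 8.
    std::cout << sizeof(int) << " " << sizeof(long) << " " << sizeof(void*) << "\n";

    // Whether plain char is signed is implementation-defined.
    char c = '\xff';
    std::cout << int(c) << "\n"; // -1 on most x86 targets, 255 on most ARM targets
}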



- Archives

Java's silent integer arithmetic overflow: Integer.MAX_VALUE + 1 quietly wraps around to Integer.MIN_VALUE, with no exception and no warning.



- Archives

The greatest design flaw that I face is that Python was not designed like Python 3.x to begin with.



- Archives

One of my big annoyances is the way switch cases in C-derived languages default to falling through to the next case if you forget to use break. I understand that this is useful in very low-level code (e.g. Duff's Device), but it is usually inappropriate for application-level code, and is a common source of coding errors.

I remember reading about the details of Java for the first time in about 1995. When I got to the part about the switch statement, I was very disappointed that they had retained the default fall-through behaviour. This just makes switch into a glorified goto by another name.
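
A minimal C++ illustration of the trap:

#include <iostream>

int main() {
    int option = 1;
    switch (option) {
    case 1:
        std::cout << "one\n";
        // forgot the break: control falls through into case 2
    case 2:
        std::cout << "two\n"; // printed even though option == 1
        break;
    default:
        std::cout << "other\n";
    }
}

This prints both "one" and "two". Languages designed later (C#, Go, Swift) either forbid implicit fall-through or make it opt-in with an explicit keyword.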



- Archives

The choice of + in JavaScript for both addition and string concatenation was a terrible mistake. Since values are untyped, this leads to byzantine rules that determine whether + will add or concatenate, depending on the exact content of each operand: for example, 1 + 2 + "3" evaluates to "33", while "1" + 2 + 3 gives "123".

It would have been easy in the beginning to introduce a completely new operator such as $ for string concatenation.



- Archives

One of the biggest issues with BASIC was the lack of any well-defined method to extend the language beyond its early environments, leading to a bunch of completely incompatible implementations (and a nearly irrelevant post-facto attempt at standardization).

Almost any language will get bent into general purpose use by some crazy programmer. It's better to plan for that general purpose usage at the beginning in case that crazy idea takes off.



- Archives

I've never really liked the use of = for assignment and == for equality testing in C-derived languages. The potential for confusion and errors is too high. And don't even get me started on === in Javascript.

Better would have been := for assignment and = for equality testing. The semantics could have been exactly the same as they are today, where assignment is an expression that also produces a value.
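
The classic accident this invites, sketched in C++ (most modern compilers at least warn about it):

#include <iostream>

int main() {
    int x = 0;
    if (x = 1)             // meant x == 1; instead assigns 1, which is "true"
        std::cout << "always taken\n";
    if (x == 1)            // the comparison that was intended
        std::cout << "x is 1\n";
}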



- Archives

I believe in DSLs (domain-specific-languages) and one thing I value in a language is if it allows me to define a DSL on top of it.

In Lisp there are macros - most people consider this a good thing, as do I.

In C and C++ there are macros - people complain about them, but I was able to use them to define DSLs.

In Java, they were left out (and therefore in C#), and the lack of them was declared to be a virtue. Sure, it lets you have IntelliSense, but to me that's just an hors d'oeuvre. To use my DSL, I have to expand it by hand. It's a pain, and it makes me look like a bad programmer, even though it lets me do a heck of a lot more with tons less code.



- Archives

One could list hundreds of mistakes in hundreds of languages, but IMO it is not a useful exercise from a language-design perspective.

Why?

Because something that would be a mistake in one language would not be a mistake in another language. For instance:

  • Making C a managed (i.e. garbage collected) language or tying down the primitive types would limit its usefulness as a semi-portable low-level language.
  • Adding C-style memory management to Java (for example, to address performance concerns) would break it.

There are lessons to be learned, but the lessons are rarely clear cut, and to understand them you have to understand the technical trade-offs ... and the historical context. (For instance, the cumbersome Java implementation of generics is a consequence of an overriding business requirement to maintain backwards compatibility.)

IMO, if you are serious about designing a new language, you need to actually use a wide range of existing languages (and study historical languages) ... and make up your own mind what the mistakes are. And you need to bear in mind that each of these languages was designed in a particular historical context, to fill a particular need.

If there are general lessons to be learned they are at the "meta" level:

  • You cannot design a programming language that is ideal for all purposes.
  • You cannot avoid making mistakes ... especially when viewed with hindsight.
  • Many mistakes are painful to correct ... for users of your language.
  • You have to take account of the background and skills of your target audience; i.e. existing programmers.
  • You cannot please everyone.


- Archives

Mine has to be UMTA Specification Language (USL), a macro language that translated into ANSI Fortran. USL's use of blanks was hideous.

USL would allow a blank in a name. So instead of "LASTTANGO" you could name your macro "LAST TANGO". But this could also mean a macro "LAST" followed by another macro named "TANGO". Read code like "LAST TANGO IN PARIS" and the combinatorial possibilities are horrid.

USL did not use begin/end or {} to indicate subsidiary code, it used spacing. Following an IF statement, all lines that were indented more than the IF statement were conditional upon that IF. Sounds easy, eh? But try tracking conditionals through several pages; try adding an ELSE with exactly the right indentation.

USL was born and died in a U.S. government agency back around 1980.



- Archives

Array decay in C and consequently C++.
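
For the unfamiliar, a short sketch of what decay costs you:

#include <iostream>

void f(int a[10]) {                   // despite appearances, a is really an int*
    std::cout << sizeof(a) << "\n";   // size of a pointer (typically 8)
}

int main() {
    int a[10];
    std::cout << sizeof(a) << "\n";   // 40 with 4-byte ints: the real array size
    f(a);                             // the array silently decays to &a[0]
}

The declared bound 10 in the parameter list is ignored entirely, so the callee has no idea how big the array actually is.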



- Archives

The preprocessor in C and C++ is a massive kludge. It creates abstractions that leak like sieves, encourages spaghetti code via rat's nests of #ifdef statements, and requires horribly unreadable ALL_CAPS names to work around its limitations. The root of these problems is that it operates at the textual level rather than the syntactic or semantic level. It should have been replaced with real language features for its various use cases. Here are some examples, though admittedly some of these are solved in C++, C99, or unofficial but de facto standard extensions (a sketch of the textual-level problem follows the list):

  • #include should have been replaced with a real module system.

  • Inline functions and templates/generics could replace most of the function call use cases.

  • Some kind of manifest/compile time constant feature could be used for declaring such constants. D's extensions of enum work great here.

  • Real syntax tree level macros could solve a lot of miscellaneous use cases.

  • String mixins could be used for the code injection use case.

  • static if or version statements could be used for conditional compilation.
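
To make the "textual level" point concrete, the classic C/C++ sketch:

#include <iostream>

#define SQUARE(x) x * x                  // pure textual substitution, no semantics

int main() {
    std::cout << SQUARE(1 + 2) << "\n";  // expands to 1 + 2 * 1 + 2, printing 5, not 9
    // The usual workaround is defensive parenthesizing: #define SQUARE(x) ((x) * (x))
}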



- Archives

C++'s making objects value types instead of reference types, and thereby ruining the chance to implement object-oriented programming in any sane way. (Inheritance and polymorphism simply don't mix with statically-allocated, fixed-size, pass-by-value data structures.)
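
The concrete failure mode is object slicing; a minimal sketch:

#include <iostream>

struct Animal {
    virtual void speak() const { std::cout << "...\n"; }
    virtual ~Animal() = default;
};

struct Dog : Animal {
    void speak() const override { std::cout << "woof\n"; }
};

int main() {
    Dog d;
    Animal a = d;    // copied by value: the Dog part is sliced away
    a.speak();       // prints "...", not "woof": polymorphism is lost
    Animal& r = d;   // dynamic dispatch only works through references
    r.speak();       // prints "woof"
}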



- Archives

Both Java and C# have annoying problems with their type systems due to the desire to maintain backwards compatibility while adding generics. Java doesn't like mixing generics and arrays; C# won't allow some useful signatures because you can't use value types as bounds.

As an example of the latter, consider that

public static T Parse<T>(Type<T> type, string str) where T : Enum

alongside or replacing

public static object Parse(Type type, string str)

in the Enum class would allow

MyEnum e = Enum.Parse(typeof(MyEnum), str);

rather than the tautological

MyEnum e = (MyEnum)Enum.Parse(typeof(MyEnum), str);

tl;dr: think about parametric polymorphism when you start designing your type system, not after you publish version 1.



- Archives

JavaScript's ambiguity between code blocks and object literals.

  {a:b}

could be a code block, where a is a label and b is an expression; or it could define an object literal with an attribute a whose value is b.



- Archives

Classes in C++ are a kind of forced design pattern in the language.

There is practically no difference at runtime between a struct and a class, and it is hard to pin down what real programming advantage "information hiding" actually buys, which is why I want to put it here.

I'm going to be downvoted for this, but anyway: C++ compilers are so hard to write that this language feels like a monster.



- Archives

The decision to implement the goto operator in PHP 5.3.

What reason could there possibly be to encourage bad programming practices that were (wisely) not implemented in previous versions?



- Archives

I think all the answers so far point to a single failing of many mainstream languages:

There is no way to change the core language without affecting backward compatibility.

If this is solved then pretty much all those other gripes can be solved.

EDIT:

This can be solved in libraries by having different namespaces, and you could conceive of doing something similar for most of the core of a language, though this might then mean you need to support multiple compilers/interpreters.

Ultimately I don't think I know how to solve it in a way that is totally satisfactory, but that doesn't mean a solution doesn't exist, or that more can't be done.



- Archives

I'm going to go back to FORTRAN and whitespace insensitivity.

It pervaded the specification. The END card had to be defined as a card with an 'E', an 'N', and a 'D' in that order in columns 7-72, and no other nonblanks, rather than a card with "END" in the proper columns and nothing else.

It led to easy syntactic confusion. DO 100 I = 1, 10 was a loop-control statement, while DO 100 I = 1. 10 was a statement that assigned the value 1.1 to a variable called DO100I. (The fact that variables could be created without declaration, their type depending on their first letter, contributed to this.) Unlike in other languages, there was no way to use spaces to separate tokens for disambiguation.

It also allowed other people to write really confusing code. There are reasons why this feature of FORTRAN was never duplicated again.



- Archives

Lack of array handling for input variables in SQL. Even when a database vendor has added on a feature to support this (I'm looking at you, SQL Server), it can be poorly designed and awkward to use.



- Archives

The Delphi / Pascal language doesn't allow multiline strings without using concatenation.



- Archives

Statements, in every language that has them. They do nothing that you can't do with expressions and prevent you from doing lots of things. The existence of a ?: ternary operator is just one example of having to try to get around them. In JavaScript, they are particularly annoying:

// With statements:
node.listen(function(arg) {
  var result;
  if (arg) {
    result = 'yes';
  } else {
    result = 'no';
  }
  return result;
})

// Without statements (hypothetical expression syntax):
node.listen(function(arg) if (arg) 'yes' else 'no')


- Archives

Perl's flattening of lists... Conversely, it's also one of its best features.



- Archives

I get the feeling that the people who designed PHP didn't use a normal keyboard, or even a Colemak keyboard, because otherwise they would have realized what they were doing.

I am a PHP developer. PHP isn't fun to type.

Who::in::their::right::mind::would::do::this()? The :: operator requires holding shift and then two key presses. What a waste of energy.

Although->this->is->not->much->better. That also requires three key presses with the shift being in between the two symbols.

$last = $we.$have.$the.$dumb.'$'.$character. The dollar sign is used a tremendous number of times and requires an awkward stretch up to the very top of the keyboard plus a shift key press.

Why couldn't they design PHP to use keys that are much faster to type? Why couldn't we.do.this(), or have vars start with a key that only requires a single keypress, or none at all (JavaScript), and just pre-declare all vars (like I have to do for E_STRICT anyway)!

I'm no slow typist - but this is just a lame design choice.



- Archives

I feel like I'm opening myself up to get flamed, but I really hate the ability to pass plain old data types by reference in C++. I only slightly hate being able to pass complex types by reference. If I'm looking at a function:

void foo()
{
    int a = 8;
    bar(a);
}

From the calling point, there is no way to tell that bar, which may be defined in a completely different file, is:

void bar(int& a)
{
    a++;
}

Some might argue that doing something like this is simply bad software design and the language isn't to blame, but I don't like that the language lets you do this in the first place. Using a pointer and calling

bar(&a);

is much more readable.



- Archives

JavaScript omits many of the date formatting and manipulation functions that almost every other language (including most SQL implementations) provides, and this has been a source of frustration for me recently.



- Archives

null.

Its inventor, Tony Hoare, calls it the "billion dollar mistake".

It was introduced in ALGOL in the 60s, and exists in most of the commonly used programming languages today.

The better alternative, used in languages like OCaml and Haskell, is the option/Maybe type. The general idea is that object references cannot be null/empty/non-existent unless there's an explicit indication in the type that they may be so.
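
The same idea eventually reached C++ as std::optional (C++17); a minimal sketch of the pattern (the function and names are mine):

#include <iostream>
#include <optional>
#include <string>

// Possible absence is visible in the signature, rather than being an
// implicit property of every reference.
std::optional<std::string> find_user(int id) {
    if (id == 42) return "zaphod";
    return std::nullopt;               // explicit "no value"
}

int main() {
    if (auto user = find_user(42))     // the caller is forced to check
        std::cout << *user << "\n";
}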

(Although Tony's awesome in his modesty, I think almost anyone would have made the same mistake, and he just happened to be first.)



- Archives

The worst sin of a programming language is not being well defined. A case I remember is C++, which, in its origins:

  1. Was so ill defined that you could not get a program to compile and run by following books or examples.
  2. Once you tweaked the program to compile and run under one compiler and OS, you'd have to start over if you switched compilers or platforms.

As I remember, it took about a decade to get C++ defined well enough to make it as professionally dependable as C. It's something that should never happen again.

Something else I consider a sin (should it go in a different answer?) is having more than one "best" way to do some common task. This is the case with (again) C++, Perl, and Ruby.



- Archives

ALTER

When I learned COBOL, the ALTER statement was still part of the standard. In a nutshell, this statement let you modify the destination of a GO TO at runtime.

The danger was that you could put this statement in some obscure section of code that was rarely accessed and it had the potential to completely change the flow of the rest of your program. With multiple ALTER statements you could make it nearly impossible to know what your program was doing at any point in time.

My university instructor, very emphatically, stated that if he ever saw that statement in any of our programs he would automatically flunk us.



- Archives

Java, PHP, C#, Delphi, Vala: mixing "pointers to objects" and "plain objects" into the single notion usually called "references". In C++ and Object Pascal, you can create objects as static variables, and objects as dynamic variables, using pointer syntax.

Example (C++):

XClass* x = new XClass();
x->y = "something";
if (x != NULL) {
  x->doSomething();
}

Example (Java / C#):

XClass x = new XClass();
x.y = "something";
if (x != null) {
  x.doSomething();
}


- Archives

In old Ultima Online / Sphere server scripting... there was no division at all, and no decimal points, though the game itself obviously used them. You could write a somewhat crude divide function, but just barely.

It is hard to overstate what a train wreck that one flaw was; it made the whole language extremely cumbersome.



- Archives

In non-Java-like languages: the concept of "GOTO", or jumping. Being able to jump around the code in a non-linear way is perhaps the most illogical feature in any written language.

It has been the most misused and irrelevant concept. Once I see one of these in a 300-line function I know I'm in for a cooking lesson in spaghetti. The exception is error handling; that is the only acceptable use of the concept of jumping.

It breaks good modern programming practices. Gotos are only acceptable for the purpose of error trapping, not as a lazy way to terminate loops or skip code.

Writing code is an art form that should be oriented toward readability. Among the many aspects of readability is linear execution: one entry point, one exit for any function, and it must flow down the page, with no jumps or gotos.

Not only does this make the code more readable, it also, by its very nature, helps you write higher-quality code. One hack begets another and another. You'll generally find that once gotos are misused, you also get multiple exit statements out of functions. Tracing conditions and logic for testing becomes infinitely more difficult and immediately reduces the robustness of any code you produce.

I wish gotos would be banished forever. They were used in assembly coding years ago, and that is where they should remain.



- Archives

The biggest flaw often seen in many programming languages is the inability to 'bootstrap': often it's not possible or practical to implement the language and its libraries using only the language itself.

When this is the case, language designers (and implementors) are skipping the hard part of making the language useful for really interesting and challenging tasks.



- Archives

Optional parameters in VB. When adding features to code it is too easy to do it as an optional parameter, then another, then another, so you end up with all these parameters that are only used in newer cases added after the initial code was written, and probably aren't passed at all by older code.

Thankfully this was solved for me by switching to C# and using overloads.



- Archives

RoboCom, whose assembly language lacks bitwise operations.

While it hardly counts as a productive language with any real value beyond learning and entertainment, RoboCom is a game where you program virtual robots to participate in code battles. Programmers have to make the most of clock cycles to make their move before their opponent does.

If a language is illsuited for what it is designed for, that is of course a flaw in the design.

I found it quite irritating for a language to lack bitwise operations, especially when the goal of the game is elimination by optimization. That, in my book, is a real design flaw, since many optimizations can be made using bitwise operations.

Wish I could have contributed something slightly more useful. :)



- Archives

Although every language has its faults, none are nuisances once you know about them. Except for this pair:

Complex syntax coupled with a wordy API

This is particularly true of a language like Objective-C. Not only is the syntax overwhelmingly complex, but the API uses method names like:

- (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath

I'm all for being explicit and unambiguous, but this is ridiculous. Each time I sit down with Xcode, I feel like a n00b, and that's really frustrating.



- Archives

Primitive types in Java.

They break the principle that everything is a descendant of java.lang.Object, which from a theoretical point of view leads to additional complexity of the language specification, and from a practical perspective they make the use of collections extremely tedious.

Autoboxing helped alleviate the practical drawbacks, but at the cost of making the specification even more complicated and introducing a big fat banana skin: now you can get a NullPointerException from what looks like a simple arithmetic operation, e.g. when int n = map.get(key); unboxes a null.



- Archives

Cramming array, list and dictionary into one abomination called PHP's "associative arrays".



- Archives

I have two things I dislike in C++:

Implicit conversion to bool (made worse by the fact that you can implement conversion operators for your own types, such as operator int() const {...}).

Default parameters: yes, they aid backward compatibility of interfaces when adding new things, but then so does overloading, so why have them?

Now combine the two together you have a recipe for disaster.
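
A sketch of how the two interact badly (the types and names here are mine):

#include <iostream>

struct Timeout {
    int seconds;
    Timeout(int s) : seconds(s) {}
    operator bool() const { return seconds > 0; }   // implicit conversion to bool
};

void connect(bool retry = false) { std::cout << "retry=" << retry << "\n"; }

int main() {
    Timeout t(30);
    connect(t);   // compiles silently: the Timeout converts to bool, so retry becomes true
    connect();    // and with a default parameter, forgetting the argument compiles too
}

C++11's explicit conversion operators exist largely to plug this hole.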



- Archives

In C, and inherited by C++, Java, and C#:

You cannot parse the language without building a symbol table. Because code like this:

Foo Bar();

looks like it could be declaring a default-constructed variable Bar of type Foo, or declaring a function Bar that returns a Foo and takes no arguments. (C++ resolves this in favor of the function declaration, the infamous "most vexing parse", but the parser still has to know whether Foo names a type at all; the other languages have similar flaws.)

This means you can't parse these languages without building a symbol table, making analysis tools much harder to write.

(Javascript and Go manage to avoid this problem.)



- Archives

ColdFusion's nested CFLoop behavior.



- Archives

The m4 macro language lacks a loop construct. Instead, the documentation recommends that you create your own using recursion. That kept me from bothering with the language pretty much forever.



- Archives

Probably not the greatest design flaw but a flaw nonetheless in my opinion...

The keyword final in the context of variables in Java.

I wish there were a non-final / mutable / reassignable keyword instead, to indicate that the reference held by a variable can be changed.

Good practice suggests that you should use the keyword final liberally, especially when you are writing multi-threaded applications.

I follow this advice and use the final keyword whenever I can. But it is very rare that I actually need a variable to be non-final, so all those finals clutter my source code and force me to type more than I really should have to.



- Archives