In most general-purpose imperative languages, semicolons as statement delimiters are either required or disallowed entirely (e.g. C and Python, respectively).
However, some languages, like JavaScript, allow you to opt out of delimiting your statements with semicolons, in favor of other delimiters (such as a newline).
What are the design decisions behind this? I understand that semicolons are essential when writing multiple statements on the same line, but is there another reason to make them mandatory (other than following C)?
Making them mandatory (or disallowing them completely) reduces the number of corner cases, eliminates a potential source of obscure bugs, and simplifies the compiler/interpreter design.
The language designers who have opted to make them optional have
chosen to live with the ambiguity in return for greater syntactic flexibility.
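To make that ambiguity concrete, here is a small self-contained JavaScript sketch (the names double and a are made up purely for illustration) of two lines the parser can legitimately read as one statement or two:

const double = x => x * 2    // hypothetical helper so the snippet runs on its own
const a = double             // intended: store the function itself in a
(3 + 4) * 2                  // intended: a separate, unrelated expression
console.log(a)               // prints 28, not a function: without a semicolon the
                             // two middle lines merge into const a = double(3 + 4) * 2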
JavaScript has shown us that this is a very bad idea. For example:
return
0;
In C, this returns a value of 0. In JavaScript, this returns undefined,
because a semicolon gets inserted right after the return keyword (turning it into return;), and it's not immediately obvious why your code is breaking unless you happen to know the details of automatic semicolon insertion.
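For completeness, here is a runnable sketch of that behaviour next to the fixed version (the function names are made up):

function brokenZero() {
  return        // ASI inserts a semicolon here, so the function returns undefined
  0             // this line becomes an unreachable expression statement
}
function zero() {
  return 0      // keeping the value on the same line returns 0 as intended
}
console.log(brokenZero())   // undefined
console.log(zero())         // 0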
It simplifies your grammar and parser somewhat to make the semicolons mandatory. Essentially, it allows the lexer to dump all the whitespace, including newlines, and the parser doesn’t have to worry about it at all.
On the other hand, once you start wanting to tell the parser about whitespace anyway, it's not that hard to make the semicolons optional. You can often just lump newlines and semicolons together into a single terminator token, and your parser can handle it just fine.
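As a rough hand-rolled sketch (in JavaScript, not tied to any real lexer API), the only question on the lexer side is whether newlines are thrown away with the rest of the whitespace or surfaced as terminator tokens:

// Tiny tokenizer sketch. With mandatory semicolons, whitespace and newlines are
// simply discarded; with optional semicolons, a newline becomes a TERMINATOR
// token that the grammar can treat exactly like ';'.
function tokenize(source, semicolonsOptional) {
  const tokens = []
  for (const chunk of source.match(/[^\s;]+|;|\n|[^\S\n]+/g) ?? []) {
    if (chunk === ';' || (chunk === '\n' && semicolonsOptional)) {
      tokens.push({ type: 'TERMINATOR' })
    } else if (/^\s/.test(chunk)) {
      // other whitespace (and newlines, when semicolons are mandatory) never reaches the parser
    } else {
      tokens.push({ type: 'WORD', value: chunk })
    }
  }
  return tokens
}
console.log(tokenize('x = 1\ny = 2', true))   // one TERMINATOR between the two statements
console.log(tokenize('x = 1\ny = 2', false))  // no TERMINATOR tokens at all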
For example, try inserting the semicolons into the following series of C statements.
functionCall(3, 4) 9 + (3 / 8) variable++ while(1) { printf("Hello, world\n") }
While there are some weird things you can no longer do, like an empty-bodied while(1);, for the most part it's relatively easy with modern parsing techniques to determine where statements end without a specific delimiter. Even if you still want to allow the weird stuff, it's not that hard to make a newline_or_semicolon non-terminal.
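If the lexer keeps newlines and semicolons as separate tokens instead of collapsing them, that non-terminal is little more than a one-line alternation. A sketch, assuming hypothetical peek/advance single-token-lookahead helpers:

// Sketch of a newline_or_semicolon rule in a hand-written recursive-descent parser.
function newlineOrSemicolon(parser) {
  const tok = parser.peek()
  if (tok.type === 'SEMICOLON' || tok.type === 'NEWLINE') {
    parser.advance()
    return true
  }
  // A closing brace or end of input can also legitimately end a statement.
  return tok.type === 'RBRACE' || tok.type === 'EOF'
}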
Semicolons are useful in a grammar for two reasons. First, they let you split long statements across multiple lines without godawful continuation characters (I'm talking about you, Fortran and Basic). Second, they give the parser a way to "give up" parsing when the syntax gets really convoluted because of a typo. Stealing from Karl Bielefeldt's example,
functionCall(3, 4) 9 + (3 / 8) variable++ while(1) { printf("Hello, world\n") }
imagine you typed one extra open paren:
functionCall((3, 4) 9 + (3 / 8) variable++ while(1) { printf("Hello, world\n") }
Now where is the mistake? If you had the semicolons, it would be easier for the parser to give up at the first one. It could even continue parsing after the semicolon if it wanted to.
functionCall((3, 4); <- something is wrong here. emit error and keep going.
9 + (3 / 8); variable++; while(1) { printf("Hello, world\n"); }
Now it is easier for the parser to report an error, and easier to locate the line/column where it occurred.
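What this describes is essentially the classic "panic mode" recovery: report the error, then discard tokens until a synchronizing token such as ';' before resuming. A sketch, assuming a hypothetical parser object with parseStatement/peek/advance/atEnd helpers:

// Panic-mode recovery sketch: each failed statement is reported, then the parser
// skips ahead to the next ';' so the rest of the input still gets checked.
function parseProgram(parser) {
  const errors = []
  while (!parser.atEnd()) {
    try {
      parser.parseStatement()
    } catch (err) {
      errors.push(err.message)
      // Synchronize: throw tokens away until the next statement boundary.
      while (!parser.atEnd() && parser.peek().type !== 'SEMICOLON') {
        parser.advance()
      }
      if (!parser.atEnd()) parser.advance()   // consume the ';' itself
    }
  }
  return errors
}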
Semicolons are not always all-or-nothing, as you suggest in your question. For example, Lua's grammar is carefully designed to be free-form (all whitespace, including newlines, can be ignored) while also never requiring semicolons. The following programs, for instance, are equivalent:
--One statement per line
x = 1
y = 2
--Multiple statements per line
x = 1 y = 2
--You can add semicolons if you want, but it's just for clarity:
x = 1; y = 2
All design and construction aside, I believe a lot of it comes down to programmers' backgrounds: some learned to use the semicolon and some didn't. Many of the newer languages that are emerging don't require a semicolon but still allow it. I think it might just be a way of letting more programmers learn these new languages without having to give up the habits they started with.