I was wondering why pointers are not included in modern languages nowadays. I have already researched this on the internet and found a few theories/reasons:
- Memory leaks are the biggest concern when using pointers.
- If not handled properly, pointers can ruin a whole project or application.
- Understanding and using pointers as well as possible takes a lot of time.
There are certainly problems, but if pointers are handled carefully (exception handling and so on), I think they can still be very helpful where required. Kindly help me out on this one.
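For example, the kind of pitfall those points describe looks something like this minimal C sketch (a made-up illustration, not from any real project):

```c
#include <stdlib.h>
#include <string.h>

int main(void) {
    /* Memory leak: the first allocation becomes unreachable when
       the only pointer to it is overwritten. */
    char *buf = malloc(64);
    buf = malloc(64);

    /* Dangling pointer: p still refers to the block after it is
       freed; writing through it is undefined behavior. */
    char *p = buf;
    free(buf);
    strcpy(p, "oops");   /* classic use-after-free */

    return 0;
}
```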
I’m not sure where you get the idea that modern languages don’t have pointers. In Ruby, for example, everything is a pointer. It’s true that Ruby doesn’t have special syntax or special operations for pointers, but that doesn’t mean that there are none. Quite the opposite, in fact: because everything is a pointer, there is no need to distinguish between pointers and non-pointers, pointer operations and non-pointer operations. Pointers are so deeply ingrained in the language that you don’t even see them.
The same is true for Python, Java, ECMAScript, Smalltalk, and many other languages.
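To make that concrete, here is a rough C sketch of what an assignment between object variables in those languages effectively does: it copies a pointer, not the object, so both names alias the same data (the struct and variable names here are made up for illustration):

```c
#include <stdio.h>

struct point { int x, y; };

int main(void) {
    struct point obj = {1, 2};

    /* In Ruby/Java terms: a and b are two variables referring to
       the same object. */
    struct point *a = &obj;
    struct point *b = a;       /* copies the pointer, not the struct */

    b->x = 99;                 /* mutate through one reference... */
    printf("%d\n", a->x);      /* ...and the other sees it: prints 99 */
    return 0;
}
```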
What those languages don’t support is pointer arithmetic or fabricating a pointer out of thin air. But then again, some CPUs don’t allow that either.
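In C terms, pointer arithmetic and “fabricating a pointer out of thin air” look something like this (the address 0x1000 is made up; whether it’s valid is entirely platform-dependent):

```c
#include <stdint.h>

int main(void) {
    int arr[4] = {10, 20, 30, 40};
    int *p = arr;

    p = p + 2;        /* pointer arithmetic: p now points at arr[2] */
    int third = *p;   /* 30 */

    /* "Out of thin air": conjuring a pointer from an integer.
       Legal C syntax, but Java/Ruby/Go give you no way to write
       this at all, and dereferencing it here would be undefined. */
    int *q = (int *)(uintptr_t)0x1000;

    (void)third; (void)q;
    return 0;
}
```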
The original CISC CPU for the AS/400 distinguishes between pointers and integers. You can store pointers and you can dereference pointers, but you cannot create or modify pointers. The only way to get a pointer is if the kernel hands one to you. If you try to do arithmetic on it, you get back an integer, which cannot be converted to or used as a pointer. Even the modern PowerPC and POWER CPUs have a special tagged address mode specifically for running OS/400 / i5/OS / IBM i.
Go has pointers in the more traditional sense, like C, but it likewise doesn’t allow pointer arithmetic.
Other languages have pointers and pointer arithmetic, but a set of restrictions that ensure that pointers are always valid, always point to initialized memory, and always point to memory that is owned by the entity performing the arithmetic.
Almost all modern programming languages use indirection extensively under the hood – any instance of a Java type that’s derived from `Object` is referenced through a pointer (or pointer-like object), for example.
The difference is that those programming languages don’t expose any pointer types or operations on pointer values to the programmer. You can’t take the address of a Java `Object` instance and examine it directly, nor can you use it to offset an arbitrary number of bytes into the instance (even though the JVM does so internally). The language simply doesn’t provide any mechanism for the programmer to do so. It doesn’t define a method or operator to obtain an object’s address; it doesn’t define a method or operator to examine the contents of an arbitrary address; it doesn’t define the binary `+` or `-` operators to work with address types. The `[]` operator doesn’t just offset from a base address; it’s smart enough to throw an exception if you attempt to index past the end of the array.
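For contrast, C’s indexing operator is defined purely in terms of pointer arithmetic: `a[i]` means `*(a + i)`, and nothing checks whether `i` is in range:

```c
#include <stdio.h>

int main(void) {
    int a[3] = {1, 2, 3};

    /* These two expressions are equivalent by definition in C. */
    printf("%d %d\n", a[1], *(a + 1));   /* prints "2 2" */

    /* Unlike Java's [], nothing stops an out-of-range index:
       this compiles cleanly but is undefined behavior. */
    int oops = a[10];

    (void)oops;
    return 0;
}
```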
Remember that C was developed (at least in part) to implement the Unix operating system; since any OS needs to manage memory, the language needed to provide operations on memory addresses as well as other types.
C became popular for applications programming because C compilers were small, fast, and produced fast code. Being able to manipulate memory directly sped up a number of operations. Unfortunately, being able to manipulate memory directly also opened up a huge can of worms with respect to security, correctness, etc. Everything from the Morris worm to the Heartbleed bug was enabled by C’s ability to manipulate memory. Also, C’s pointer syntax could be confusing, especially since unary `*` has lower precedence than postfix operators like `[]`, `()`, `++`, `.`, `->`, etc. The fact that array expressions “decay” to pointer types also leads to problems for people who don’t really know the language that well.
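A couple of concrete examples of those pitfalls, both standard C behavior:

```c
#include <stdio.h>

void f(int arr[10]) {
    /* The parameter has "decayed": arr is really an int*, so this
       prints the size of a pointer, not 10 * sizeof(int). */
    printf("inside f: %zu\n", sizeof arr);
}

int main(void) {
    int a[10] = {0};
    printf("in main:  %zu\n", sizeof a);   /* 40 on a typical platform */
    f(a);                                  /* array decays to &a[0] */

    /* Precedence trap: postfix ++ binds tighter than unary *. */
    int x = 5;
    int *p = &x;
    (*p)++;        /* increments x (now 6) */
    /* *p++ would instead mean *(p++): use p, then advance the pointer. */
    printf("%d\n", x);
    return 0;
}
```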
So modern programming languages don’t expose pointers the way C does to avoid many of these problems. However, note that most of C’s contemporaries (Pascal, Fortran, BASIC, etc.) didn’t expose operations on pointer values either, even though they used pointer-like semantics under the hood (passing arguments by reference, COMMON blocks, etc.).
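For instance, a Pascal `var` parameter is typically compiled down to exactly what an explicit pointer parameter is in C; the language just hides the indirection (a rough sketch, with made-up names):

```c
#include <stdio.h>

/* Roughly what a Pascal "procedure Inc(var n: Integer)" compiles to:
   the caller's address is passed, but Pascal never shows the pointer. */
void inc(int *n) {
    *n += 1;       /* in Pascal source this is just "n := n + 1" */
}

int main(void) {
    int x = 41;
    inc(&x);       /* the compiler inserts the address-of for you */
    printf("%d\n", x);   /* prints 42 */
    return 0;
}
```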