
Re: Some GNUstep discussions in other forums

From: David Chisnall
Subject: Re: Some GNUstep discussions in other forums
Date: Thu, 27 Dec 2018 11:15:29 +0000
User-agent: Mozilla/5.0 (Windows NT 10.0; WOW64; rv:60.0) Gecko/20100101 Thunderbird/60.3.3

On 26/12/2018 16:08, Patryk Laurent wrote:
> Hi David,
>
>> a language (which is somewhat dated even in its latest incarnation).
>
> I would love to know your thoughts with respect to that point, if you'd care to share (off list if you'd prefer). Or might you have a talk/article you could point me to?

A programming language is intended as a compromise between the abstract algorithms in the programmer's head and the concrete hardware on which it runs. Ideally, it should be easy to map in both directions: from the human to the programming language, and then to the target hardware. Roughly speaking, a high-level language is one that optimises for the human-to-programming-language translation, and a low-level language is one that optimises for the programming-language-to-machine translation.

Objective-C is a good late '80s programming language. It has an abstract machine that is very close to late '80s hardware: flat memory, a single processor. This isn't the world that we're currently living in. Computers have multiple, heterogeneous processors. My ancient phone (Moto G, first generation) has four ARM cores, some GPU cores, and a bunch of specialised accelerators. It has a fairly simple memory hierarchy, but my laptop and desktop both have 3 layers of caches between the CPU cores and main memory, and have private DRAM attached to the GPU.

A modern language has to expose an abstract machine that's similar to this. A good language also has to remove mechanical work from the programmer. Objective-C does some nice things with reflection here: for example, you need a lot less boilerplate in Objective-C to wire up a GUI to its controller than in Java.

Objective-C more or less gives you memory safety if you avoid the C subset of the language (though that's pretty hard). Unfortunately, even if you avoid C in your own code, all non-trivial Objective-C programs link to a load of complex (and, therefore, buggy) C libraries. They have no protection from these libraries: a single pointer bug in the C code can violate all of the invariants that the Objective-C runtime depends on (as you can see from a lot of previous posts on this list).

Modern Objective-C, with ARC, at least gives you temporal memory safety, though it also gives you memory leaks if you have cyclic data structures and don't explicitly break memory cycles. Classes such as NSArray give you spatial memory safety if you use them instead of C arrays (and don't call methods like -data). With Objective-C++, you can use lower-overhead things like std::string and std::vector for primitive types and get memory safety if you use .at() instead of operator[], but it's somewhat clunky (memory safety is possible, but it isn't the easiest option).

C++ has evolved a lot in the last 7 years. With std::shared_ptr and std::unique_ptr, you get the same level of memory safety as ARC, with similar overheads. ARC integrates nicely with Objective-C++, so you can put Objective-C object pointers into C++ structs safely (including, for example, having a std::vector<id>). Objective-C and C++ have very different strengths: Objective-C provides high-level abstractions for late binding, C++ provides tight coupling for low-level compile-time specialised data structures.

If you have to write Objective-C now, I'd recommend Objective-C++ with ARC as the default base language. It's no surprise that this was Microsoft's choice for WinObjC, and apparently Apple also uses Objective-C++ extensively in their own frameworks. GNUstep is somewhat crippled by using neither ARC nor Objective-C++ internally; both significantly improve developer productivity.

The three big challenges in language design for modern requirements are:

- Concurrency (including heterogeneous multiprocessing)
- Error handling
- Safe isolation (sandboxing / compartmentalisation)

Be suspicious of any 'new' language that doesn't have a good story for all of these. If you can't express the idea of a graph of objects that the GPU now has exclusive access to, then your language isn't suitable for modern hardware. If you have to think about memory safety and can't easily integrate with sandboxed C libraries, then it isn't suitable for modern security requirements.

