Discussion:
[fonc] Stephen Wolfram on the Wolfram Language
Tim Olson
2014-09-24 22:20:00 UTC
Interesting talk by Stephen Wolfram at the Strange Loop conference:

http://youtu.be/EjCWdsrVcBM

He goes in the direction of creating a “big” language, rather than a small kernel that can be built upon, like Smalltalk, Maru, etc.

— tim
Reuben Thomas
2014-09-24 22:32:36 UTC
Post by Tim Olson
http://youtu.be/EjCWdsrVcBM
He goes in the direction of creating a “big” language, rather than a small
kernel that can be built upon, like Smalltalk, Maru, etc.
Smalltalk and Maru are rather different: Ian Piumarta would argue, I
suspect, that the distinction between "small" and "large" languages is an
artificial one imposed by most languages' inability to change their syntax.
Smalltalk can't, but Maru can. Here we see Ian making Maru understand
Smalltalk, ASCII state diagrams, and other things:

http://youtu.be/EGeN2IC7N0Q

That's the sort of small kernel you could build Wolfram on.

Racket is a production-quality example of the same thing:
http://racket-lang.org
--
http://rrt.sc3d.org
David Leibs
2014-09-24 23:39:21 UTC
I think Stephen is misrepresenting the Wolfram Language when he says it is a big language. He is really talking about the built-in library, which is indeed huge. The language proper is actually simple, powerful, and lispy: underneath, every expression is just head[arguments], e.g. a + b*c is really Plus[a, Times[b, c]].
-David
Post by Tim Olson
http://youtu.be/EjCWdsrVcBM
He goes in the direction of creating a “big” language, rather than a small kernel that can be built upon, like Smalltalk, Maru, etc.
Post by Reuben Thomas
http://youtu.be/EGeN2IC7N0Q
That's the sort of small kernel you could build Wolfram on.
Racket is a production-quality example of the same thing: http://racket-lang.org
--
http://rrt.sc3d.org
BGB
2014-09-25 15:05:42 UTC
Post by David Leibs
I think Stephen is misrepresenting the Wolfram Language when he says
it is a big language. He is really talking about the built in library
which is indeed huge. The language proper is actually simple,
powerful, and lispy.
-David
I think it is partly size along two axes:
features and syntax built directly into the language core;
features that can be built on top of the language via libraries and
extensibility mechanisms.

a lot of mainstream languages have tended to be bigger in terms of
built-in features and basic syntax (e.g., C++ and C#);
a lot of other languages have more in the way of extensibility
features, with less distinction between library code and the core language.

of course, if a language has neither, it tends to be regarded as a
"toy language".

more so if the implementation lacks the scalability to support a
reasonably sized set of library facilities (say, if it always loads
from source and there is relatively high overhead for loaded code).


sometimes it isn't as clear-cut as "apparent complexity" ==
"implementation complexity".

for example, a more complex-looking language could reduce to a fairly
simple underlying architecture (say, because the language is largely
syntax sugar; see the sketch below);
OTOH, a simple-looking language could have a considerably more
complicated implementation (say, because a lot of complex analysis and
internal machinery is needed to make it work acceptably).
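
as a concrete illustration of the syntax-sugar case (my own sketch in
Java, not anything from the talk): the enhanced for loop looks like a
dedicated language feature, but the compiler just rewrites it into the
plain Iterator protocol:

import java.util.Iterator;
import java.util.List;

public class Desugar {
    public static void main(String[] args) {
        List<String> names = List.of("a", "b", "c");

        // surface syntax: looks like a dedicated language feature
        for (String n : names) {
            System.out.println(n);
        }

        // roughly what the compiler emits: the plain Iterator protocol
        for (Iterator<String> it = names.iterator(); it.hasNext(); ) {
            System.out.println(it.next());
        }
    }
}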

in many cases, things are represented somewhat differently in the
high-level language vs nearer the underlying implementation, so
representational complexity may be reduced at one point and expanded at
another.


another related factor I have seen is whether the library API design
focuses more on core abstractions and building things from these, or
more on a large number of specific use-cases. for example, Java has a
class for nearly every way they could think up that a person might want
to read/write a file, as opposed to, say, a more generic multipurpose IO
interface (compare the sketch below).
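
for instance (a rough sketch of the contrast; "notes.txt" is just a
hypothetical input file):

import java.io.BufferedReader;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

public class IoStyles {
    public static void main(String[] args) throws IOException {
        Path path = Path.of("notes.txt"); // hypothetical input file

        // use-case-specific style: one class per concern, stacked by hand
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(new FileInputStream(path.toFile()),
                                      StandardCharsets.UTF_8))) {
            System.out.println(r.readLine());
        }

        // generic style: a single multipurpose entry point
        System.out.println(Files.readString(path));
    }
}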


generally though, complexity has tended to be less of an issue than
utility and performance.
for most things, it is preferable to have a more useful language, even
at the cost of a more complex compiler, at least up to the point where
the added complexity outweighs any marginal gains in utility or
performance.

where is this point exactly? it is subject to debate.
Post by David Leibs
Post by Tim Olson
http://youtu.be/EjCWdsrVcBM
He goes in the direction of creating a "big" language, rather
than a small kernel that can be built upon, like Smalltalk, Maru,
etc.
Post by Reuben Thomas
Smalltalk and Maru are rather different: Ian Piumarta would argue, I
suspect, that the distinction between "small" and "large" languages
is an artificial one imposed by most languages' inability to change
their syntax. Smalltalk can't, but Maru can. Here we see Ian making
Maru understand Smalltalk, ASCII state diagrams, and other things:
http://youtu.be/EGeN2IC7N0Q
That's the sort of small kernel you could build Wolfram on.
Racket is a production-quality example of the same thing:
http://racket-lang.org
--
http://rrt.sc3d.org
Chris Warburton
2014-09-25 17:03:28 UTC
One interesting question for the Wolfram Language is whether it evolves
or stagnates over the years and decades.

Would new language features be allowed if they disrupt some of the vast
library? For example, Java can still be regarded as two languages: one
that's thread-safe and one that's not (see the sketch below).
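
To make that split concrete (my reading of it, sketched in Java): the
legacy synchronized collections still ship alongside their
unsynchronized replacements, so both "dialects" must be maintained
indefinitely:

import java.util.ArrayList;
import java.util.List;
import java.util.Vector;

public class TwoDialects {
    public static void main(String[] args) {
        // legacy "thread-safe" dialect: every method is synchronized
        Vector<Integer> legacy = new Vector<>();
        legacy.add(1);

        // modern unsynchronized dialect: the caller manages locking
        List<Integer> modern = new ArrayList<>();
        modern.add(1);

        // both remain in the standard library for backwards compatibility
        // (the same split exists for StringBuffer vs StringBuilder)
        System.out.println(legacy + " " + modern);
    }
}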

Would the library maintain backwards-compatibility for deprecated,
obsolete features?

Will the language fragment into mutually-incompatible subsets?

Regards,
Chris
