Search Results

Search found 16752 results on 671 pages for 'multi language'.

Page 30/671 | < Previous Page | 26 27 28 29 30 31 32 33 34 35 36 37  | Next Page >

  • What are practical guidelines for evaluating a language's "Turing Completeness"?

    - by AShelly
    I've read "what-is-turing-complete" and the Wikipedia page, but I'm less interested in a formal proof than in the practical implications of being Turing Complete. What I'm actually trying to decide is if the toy language I've just designed could be used as a general-purpose language. I know I can prove it is if I can write a Turing machine with it. But I don't want to go through that exercise until I'm fairly certain of success. Is there a minimum set of features without which Turing Completeness is impossible? Is there a set of features which virtually guarantees completeness? (My guess is that conditional branching and a readable/writeable memory store will get me most of the way there.) EDIT: I think I've gone off on a tangent by saying "Turing Complete". I'm trying to guess with reasonable confidence that a newly invented language with a certain feature set (or alternately, a VM with a certain instruction set) would be able to compute anything worth computing. I know proving you can build a Turing machine with it is one way, but not the only way. What I was hoping for was a set of guidelines like: "if it can do X, Y, and Z, it can probably do anything".
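
    A minimal sketch (mine, not the asker's) of the guess above, in Python: an interpreter whose only features are a readable/writeable store and a conditional branch. Two instructions over unbounded registers, INC and DECJZ (decrement, or jump if zero), give a Minsky counter machine, which is known to be Turing complete.

        from collections import defaultdict

        def run(program, registers=None):
            regs = defaultdict(int, registers or {})      # read/write memory store
            pc = 0
            while 0 <= pc < len(program):
                op, *args = program[pc]
                if op == "INC":                           # regs[r] += 1
                    regs[args[0]] += 1
                    pc += 1
                elif op == "DECJZ":                       # the conditional branch
                    r, target = args
                    if regs[r] == 0:
                        pc = target                       # jump if zero
                    else:
                        regs[r] -= 1
                        pc += 1
                else:
                    raise ValueError("unknown op")
            return dict(regs)

        # Example: b := a + b, then halt (jumping to 3 falls off the end).
        add = [
            ("DECJZ", "a", 3),
            ("INC", "b"),
            ("DECJZ", "z", 0),   # z stays 0, so this is an unconditional jump
        ]
        print(run(add, {"a": 2, "b": 3}))   # -> {'a': 0, 'b': 5, 'z': 0}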

    Read the article

  • How do you read a file line by line in your language of choice?

    - by Jon Ericson
    I got inspired to try out Haskell again based on a recent answer. My big block is that reading a file line by line (a task made simple in languages such as Perl) seems complicated in a functional language. How do you read a file line by line in your favorite language? So that we are comparing apples to other types of apples, please write a program that numbers the lines of the input file. So if your input is: Line the first. Next line. End of communication. The output would look like: 1 Line the first. 2 Next line. 3 End of communication. I will post my Haskell program as an example. Ken commented that this question does not specify how errors should be handled. I'm not overly concerned about it because: Most answers did the obvious thing and read from stdin and wrote to stdout. The nice thing is that it puts the onus on the user to redirect those streams the way they want. So if stdin is redirected from a non-existent file, the shell will take care of reporting the error, for instance. The question is more aimed at how a language does IO than how it handles exceptions. But if necessary error handling is missing in an answer, feel free to either edit the code to fix it or make a note in the comments.
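
    For comparison, a quick sketch in Python rather than the asker's Haskell; it follows the convention most answers used, reading stdin and writing numbered lines to stdout so the shell handles redirection and error reporting.

        import sys

        # Number each line of the input, keeping the line's own newline.
        for number, line in enumerate(sys.stdin, start=1):
            sys.stdout.write(f"{number} {line}")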

    Read the article

  • What is a Windows scripting language that: does not rely on .NET and offers the most OOP support and has simplest deployment?

    - by jJack
    What is a Windows scripting language that: does not rely on .NET and offers the most OOP support and has simplest deployment? It doesn't necessarily need to be a scripting language; it can be in the form of a compiled executable, however it needs to be self-contained--only ONE file, no DLLs, and it cannot be declared to "include" other files. I cannot rely on the user having any .NET installed and it needs to be able to run on Windows 7 64 bit. By "most OOP support", I basically mean anything that has better OOP support than VBScript. A little context: Everything I have done thus far is in VBScript and writes a bunch of data into an .html file, which in the end is to be viewed by Internet Explorer. It also zips up a bunch of directories and files. It heavily relies on accessing the registry, file-system, and WMI (I can probably do without accessing WMI though, as long as I have good registry access). I can bring myself to code in any language so long as it meets my ridonkulous requirements stated above. I look forward to some good answers from those more experienced than I.

    Read the article

  • What programming language do you wish would quietly retire? [closed]

    - by Gregory Higley
    This is the inverse of the "What programming language do you wish would catch on?" question. I was a Delphi programmer for many years, and I still appreciate its power, but I dislike verbose programming languages. So I would love to see Pascal put out to pasture. The same goes for BASIC in any form, despite the fact that it's the language I cut my teeth on. When I look at cathedrals of beauty like Haskell and REBOL, BASIC just makes me cringe. (VB.NET is tolerable, but barely. It has a few nice language features I'd like to see moved to C#.) My dislike of Pascal and VB.NET is subjective. They are powerful languages, but I dislike their syntax esthetically. Try to explain your reasoning, if you can, even if it's just "I don't like its syntax." This question is not meant to be a flame war, argumentative, or hateful. It's meant to be a straightforward, honest discussion of programmers' dislikes.

    Read the article

  • Scheme vs Common Lisp: war stories

    - by SuperElectric
    There is no shortage of vague "Scheme vs Common Lisp" questions on both StackOverflow and this site, so I want to make this one more focused. The question is for people who have coded in both languages: While coding in Scheme, what specific elements of your Common Lisp coding experience did you miss most? Or, conversely, while coding in Common Lisp, what did you miss from coding in Scheme? I don't necessarily mean just language features. The following are all valid things to miss, as far as the question is concerned: Specific libraries. Specific features of development environments like SLIME, DrRacket, etc. Features of particular implementations, like Gambit's ability to write blocks of C code directly into your Scheme source. And of course, language features. Examples of the sort of answers I'm hoping for: "I was trying to implement X in Common Lisp, and if I had Scheme's first-class continuations, I totally would've just done Y, but instead I had to do Z, which was more of a pain." "Scripting the build process in my Scheme project got increasingly painful as my source tree grew and I linked in more and more C libraries. For my next project, I moved back to Common Lisp." "I have a large existing C++ codebase, and for me, being able to embed C++ calls directly in my Gambit Scheme code was totally worth any shortcomings that Scheme may have vs Common Lisp, even including lack of SWIG support." So, I'm hoping for war stories, rather than general sentiments like "Scheme is a simpler language" etc.

    Read the article

  • Changes in Language Punctuation [closed]

    - by Wes Miller
    More social curiosity than actual programming question... (I got shot for posting this on Stack Overflow. They sent me here. At least I hope here is where they meant.) Based on the few responses I got before the content police ran me off Stack Overflow, I should note that I am legally blind and neatness and consistency in programming are my best friends. A thousand years ago when I took my first programming class (Fortran 66) and a mere 500 years ago when I took my first C and C++ classes, there were some pretty standard punctuation practices across languages. I saw them in Basic (shudder), PL/1, PL/AS, Rexx, even Pascal. Ok, APL2 is not part of this discussion. Each language has its own peculiar punctuation. Pascal's periods, Fortran's comma-separated do loops, almost everybody else's semicolons. As I learned it, each language also has KEYWORDS (if, for, do, while, until, etc.) which are set off by whitespace (or the left margin): if, etc. Each language has functions, subroutines or whatever they're called. Some built-in, some user-coded. They were set off by function_name( parameters );. As in sqrt( x ) or rand( y ); Lately, there seems to be a new set of punctuation rules. Especially in C++, where initializers get glued onto the end of variable declarations: int x(0); or auto_ptr p(new gizmo); This usually, briefly, fools me into thinking someone is declaring a function prototype or using a function as an integer. Then "if" and 'for' seem to have grown parens: if(true) for(;;), etc. Since when did keywords become functions? I realize some people think they ARE functions with iterators as parameters. But if "for" is a function, where did the arg-separating commas go? And finally, functions seem to have shed their parens: sqrt (2) select (...) I know, I know, loosening whitespace rules is good. Keep reading. Question: when did the old ways disappear and this new way come into vogue? Does anyone besides me find it irritating to read, and find that the information that the placement of punctuation used to convey is gone? I know full well that K&R put the { at the end of the "if" or "for" to save a byte here and there. Can't use that excuse here. Space as an excuse for loss of readability died as HDD space soared past 100 MiB. Your thoughts are solicited. If there is a good reason to do this, I'll gladly learn it and maybe in another 50 years I'll get used to it. Of course it's good that compilers recognize these (IMHO) typos and keep right on going, but just because you CAN code it that way doesn't mean you HAVE to, right?

    Read the article

  • Locale variables have no effect in remote shell (perl: warning: Setting locale failed.)

    - by Janning
    I have a fresh Ubuntu 12.04 installation. When I connect to my remote server I get errors like this: ~$ ssh example.com sudo aptitude upgrade ... Traceback (most recent call last): File "/usr/bin/apt-listchanges", line 33, in <module> from ALChacks import * File "/usr/share/apt-listchanges/ALChacks.py", line 32, in <module> sys.stderr.write(_("Can't set locale; make sure $LC_* and $LANG are correct!\n")) NameError: name '_' is not defined perl: warning: Setting locale failed. perl: warning: Please check that your locale settings: LANGUAGE = (unset), LC_ALL = (unset), LC_TIME = "de_DE.UTF-8", LC_MONETARY = "de_DE.UTF-8", LC_ADDRESS = "de_DE.UTF-8", LC_TELEPHONE = "de_DE.UTF-8", LC_NAME = "de_DE.UTF-8", LC_MEASUREMENT = "de_DE.UTF-8", LC_IDENTIFICATION = "de_DE.UTF-8", LC_NUMERIC = "de_DE.UTF-8", LC_PAPER = "de_DE.UTF-8", LANG = "en_US.UTF-8" are supported and installed on your system. perl: warning: Falling back to the standard locale ("C"). locale: Cannot set LC_ALL to default locale: No such file or directory No packages will be installed, upgraded, or removed. 0 packages upgraded, 0 newly installed, 0 to remove and 0 not upgraded. Need to get 0 B of archives. After unpacking 0 B will be used. ... I don't have this problem when I connect from an older Ubuntu installation. This is the output from my Ubuntu 12.04 installation; LANG and LANGUAGE are set: $ locale LANG=de_DE.UTF-8 LANGUAGE=de_DE:en_GB:en LC_CTYPE="de_DE.UTF-8" LC_NUMERIC=de_DE.UTF-8 LC_TIME=de_DE.UTF-8 LC_COLLATE="de_DE.UTF-8" LC_MONETARY=de_DE.UTF-8 LC_MESSAGES="de_DE.UTF-8" LC_PAPER=de_DE.UTF-8 LC_NAME=de_DE.UTF-8 LC_ADDRESS=de_DE.UTF-8 LC_TELEPHONE=de_DE.UTF-8 LC_MEASUREMENT=de_DE.UTF-8 LC_IDENTIFICATION=de_DE.UTF-8 LC_ALL= Does anybody know what has changed in Ubuntu to cause this error message on remote servers?

    Read the article

  • Are Java's public fields just a tragic historical design flaw at this point?

    - by Avi Flax
    It seems to be Java orthodoxy at this point that one should basically never use public fields for object state. (I don't necessarily agree, but that's not relevant to my question.) Given that, would it be right to say that from where we are today, it's clear that Java's public fields were a mistake/flaw of the language design? Or is there a rational argument that they're a useful and important part of the language, even today? Thanks! Update: I know about the more elegant approaches, such as in C#, Python, Groovy, etc. I'm not directly looking for those examples. I'm really just wondering if there's still someone deep in a bunker, muttering about how wonderful public fields really are, and how the masses are all just sheep, etc. Update 2: Clearly static final public fields are the standard way to create public constants. I was referring more to using public fields for object state (even immutable state). I'm thinking that it does seem like a design flaw that one should use public fields for constants, but not for state… a language's rules should be enforced naturally, by syntax, not by guidelines.

    Read the article

  • Classes as a compilation unit

    - by Yannbane
    If "compilation unit" is unclear, please refer to this. However, what I mean by it will be clear from the context. Edit: my language allows for multiple inheritance, unlike Java. I've started designing+developing my own programming language for educational, recreational, and potentially useful purposes. At first, I've decided to base it off Java. This implied that I would have all the code be written inside classes, and that code compiles to classes, which are loaded by the VM. However, I've excluded features such as interfaces and abstract classes, because I found no need for them. They seemed to be enforcing a paradigm, and I'd like my language not to do that. I wanted to keep the classes as the compilation unit though, because it seemed convenient to implement, familiar, and I just liked the idea. Then I noticed that I'm basically left with a glorified module system, where classes could be used either as "namespaces", providing constants and functions using the static directive, or as templates for objects that need to be instantiated ("actual" purpose of classes in other languages). Now I'm left wondering: what are the benefits of having classes as compilation units? (Also, any general commentary on my design would be much appreciated.)

    Read the article

  • Next in Concurrency

    - by Jatin
    For the past year I have been working a lot on concurrency in Java and have built and worked on many concurrent packages. So in terms of development in the concurrent world, I am quite confident. Further, I am very much interested to learn and understand more about concurrent programming. But I am unable to answer myself: what next? What extra should I learn or work on to gain more skills related to multi-core processing? Is there any nice book (I read and enjoyed 'Concurrency in Practice' and 'Concurrent Programming in Java') or resource related to multi-core processing that would take me to the next level?

    Read the article

  • css: Cross-browser, reflowing, top-to-bottom, multi-column lists

    - by Sai Emrys
    See http://cssfingerprint.com/about#stats. See also http://stackoverflow.com/questions/933645/multi-column-css-lists. I want a multi-column list that: uses no JS; reflows on window size; makes as many columns as fit the enclosing element (therefore, does not require batching the list into manual column groups); works in all browsers; works for an arbitrary number of unknown-width (but single-line-height) elements; makes each column fit the width of its (dynamic) contents; does not create scrollbars or other overflow issues; and is sorted top to bottom where possible. My code is currently: ul.multi, ol.multi { width: 100%; margin: 0; padding: 0; list-style: none; -moz-column-width: 12em; -webkit-column-width: 12em; column-width: 12em; -moz-column-gap: 1em; -webkit-column-gap: 1em; column-gap: 1em; } ul.multi li, ol.multi li { <!--[if IE]> float: left; <![endif]--> width: 20em; margin: 0; padding: 0; } Although this works okay, it has some problems: I have to guess the content width; it is right-to-left in IE (though this is acceptable as a graceful degradation mode); and it won't work at all in non-IE, non-Moz/Webkit/CSS3 browsers. How can this be improved?

    Read the article

  • What is the recommended approach towards multi-tenant databases in MongoDB?

    - by Braintapper
    I'm thinking of creating a multi-tenant app using MongoDB. I don't have any guesses in terms of how many tenants I'd have yet, but I would like to be able to scale into the thousands. I can think of three strategies: (1) all tenants in the same collection, using tenant-specific fields for security; (2) one collection per tenant in a single shared DB; (3) one database per tenant. The voice in my head is suggesting that I go with option 2. Thoughts and implications, anyone?
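
    A rough pymongo sketch of what strategy (1) (shared collection, tenant-scoped queries) looks like in practice; the database, collection, and field names here are made up for illustration.

        from pymongo import MongoClient

        client = MongoClient("mongodb://localhost:27017")
        db = client["app"]   # hypothetical shared database

        def docs_for_tenant(tenant_id):
            # Every read must filter on the tenant field; forgetting it
            # leaks data across tenants, which is the main risk of option 1.
            return db.documents.find({"tenant_id": tenant_id})

        def add_doc(tenant_id, payload):
            # Every write must stamp the tenant field.
            return db.documents.insert_one(dict(payload, tenant_id=tenant_id))

        def tenant_db(tenant_id):
            # Option 3 would instead select a whole database per tenant.
            return client[f"tenant_{tenant_id}"]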

    Read the article

  • Is Perl still a useful, viable language?

    - by Bob
    I know it may have been asked before, but here goes nothing... Is Perl still something that would be considered useful? If someone were a new programmer (either completely new to programming or with just a few months/years of experience) would Perl be considered worthwhile to learn? Is Perl still used with frequency? Is it still popular? Or is Perl dying out compared to languages like Python, Ruby, PHP, ASP, .NET, etc.? Basically it boils down to this: Is it still used/is it still used frequently? If yes, is it dying? If no, will it make a comeback? Is it something that would be worth learning? How does it compare in demand to languages like Python in both popularity and usability/viability? Could languages like Python or Ruby be considered replacements for Perl? Also, will newer versions of Perl really bring a large improvement to the Perl community, and perhaps bring Perl back to center stage compared to other languages? EDIT: Okay, I suppose here's a better, reworded question: Is Perl still growing, or is it "dying"? Is it still a language worth learning and using? What projects does it really "shine" in compared to other languages? What makes Perl a language to choose? Essentially: is Perl growing obsolete compared to other languages, and if so, do you expect that to change, or to continue? And thank you to everyone who has answered so far, the discussion has been really interesting!

    Read the article

  • So, "Are Design Patterns Missing Language Features"?

    - by Eduard Florinescu
    I saw the answer to this question: How does thinking on design patterns and OOP practices change in dynamic and weakly-typed languages? It links to an article with an outspoken title: Are Design Patterns Missing Language Features. In it you can find snippets that seem very objective and factual and that can be verified from experience, like: PaulGraham said "Peter Norvig found that 16 of the 23 patterns in Design Patterns were 'invisible or simpler' in Lisp." and one that confirms what I recently saw with people trying to simulate classes in JavaScript: Of course, nobody ever speaks of the "function" pattern, or the "class" pattern, or numerous other things that we take for granted because most languages provide them as built-in features. OTOH, programmers in a purely PrototypeOrientedLanguage? might well find it convenient to simulate classes with prototypes... I also take into consideration that design patterns are a communication tool, and even with my limited experience participating in building applications I can see it as an anti-pattern (ineffective and/or counterproductive) to, for example, force a small PHP team to learn GoF patterns for a small-to-medium intranet app, so I am aware that scale, scope and purpose can determine what is effective and/or productive. I have seen small commercial applications that mixed functional code with OOP and were still maintainable, and I don't know if many people would need, for example, to write a singleton in Python, when for me a simple module does the thing. So are there studies or hands-on experience shared that take all of this into consideration, scale and scope of the project, dynamics and size of the team, languages and technologies, so that you don't feel that a (difficult for some) design pattern is there just because there isn't a simpler way to do it or because it cannot be done with a language feature?
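
    To make the "a simple module does the thing" remark concrete, here is a small Python sketch (the names are invented) contrasting a GoF-style Singleton with a plain module:

        # GoF-style: a class that guards its single instance.
        class Settings:
            _instance = None

            @classmethod
            def instance(cls):
                if cls._instance is None:
                    cls._instance = cls()
                return cls._instance

        # Module-style: put this in settings.py and just `import settings`.
        # Python caches modules in sys.modules, so every importer shares the
        # same dict -- the language feature replaces the pattern.
        settings = {}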

    Read the article

  • Syntax Highlighting for Gherkin (Cucumber Language)

    - by Liam McLennan
    SyntaxHighlighter is the de facto standard for syntax highlighting on the web. I am currently working on a tool for publishing BDD specifications on the web and I want syntax highlighting. Unfortunately, SyntaxHighlighter does not support Gherkin, the language Cucumber and SpecFlow use to define BDD specifications. Writing new language parsers for SyntaxHighlighter is very easy, so I implemented one for Gherkin. Here is what a syntax highlighted Gherkin file looks like: # A comment here Feature: Some terse yet descriptive text of what is desired In order to realize a named business value As a explicit system actor I want to gain some beneficial outcome which furthers the goal @secretlabel Scenario: Some determinable business situation Given some precondition And some other precondition When some action by the actor And some other action And yet another action Then some testable outcome is achieved And something else we can check happens too Like all SyntaxHighlighter brushes to use this one you need to install the brush (shBrushGherkin.js). I have also used a custom theme to get it just the way I wanted it (shThemeGherkin.css). If you would like to use my Gherkin brush you may download the code and example page.

    Read the article

  • Types of quotes for an HTML templating language

    - by Ralph
    I'm developing a templating language, and now I'm trying to decide on what I should do with quotes. I'm thinking about having 3 different types of quotes which are all handled differently:

                            backtick `    double quote "    single quote '
        expand variables        ?              yes               no
        escape sequences        no             yes               ?
        escape html             no             yes               yes

    Backticks are meant to be used for outputting JavaScript or unescaped HTML. It's often handy to be able to pass variables into JS, but it could also cause issues with things being treated as variables that shouldn't. My variables are PHP-style ($var) so I'm thinking that might mess with jQuery pretty bad... but if I disable variable expansion w/ backticks then, I'm not sure how I would insert a variable into a JS code block? Single quotes: not sure if escape sequences like \n should be treated as literals or converted. I find it pretty rare that I want to disable escape sequences, but if you do, you could use backticks. So I'm leaning towards "yes" for this one, but that would be contrary to how PHP does it. Double quotes: pretty certain I want everything enabled for this one. Modifiers: I'm also thinking about adding modifiers like @ or r in front of the string that would change some of these options to enable a few more combinations. I would need 9 different quotes or 3 quotes and 2 modifiers to get every combination, wouldn't I? My language also supports "filters" which can be applied against any "term" (number, variable, string) so you could always write something like "blah blah $var blah"|expandvars Or "my string"|escapehtml Thoughts? What would you prefer? What would be least confusing/most intuitive?
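
    To see how the table above might cash out, here is a hypothetical Python sketch of the three quote types (the function names, the $var syntax, and the choice of "yes" for the two "?" cells are my assumptions, not an existing implementation):

        import html
        import re

        VARIABLES = {"var": "<b>world</b>"}   # example template variables

        def expand_vars(s):
            # Replace PHP-style $name references with their values.
            return re.sub(r"\$(\w+)",
                          lambda m: str(VARIABLES.get(m.group(1), m.group(0))), s)

        def process_escapes(s):
            # Turn literal \n, \t, etc. into real control characters.
            return s.encode("utf-8").decode("unicode_escape")

        def render(s, quote):
            if quote == "`":    # backtick: raw JS/HTML, variables expanded ("?" -> yes)
                return expand_vars(s)
            if quote == '"':    # double quote: everything enabled
                return html.escape(expand_vars(process_escapes(s)))
            if quote == "'":    # single quote: no variables, escapes on ("?" -> yes)
                return html.escape(process_escapes(s))
            raise ValueError("unknown quote type")

        print(render(r"blah $var", '"'))   # -> blah &lt;b&gt;world&lt;/b&gt;
        print(render(r"blah $var", "'"))   # -> blah $var (no expansion)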

    Read the article

  • Should I pick up a functional programming language?

    - by Statement
    I have recently been more concerned about the way I write my code. After reading a few books on design patterns (and overzealous implementation of them, I'm sure) I have shifted my thinking greatly toward encapsulating that which changes. I tend to notice that I write fewer interfaces and more method-oriented code, where I love to spruce life into old classes with predicates, actions and other delegate tasks. I tend to think that it's often the actions that change, so I encapsulate those. I even often, although not always, break down interfaces to a single method, and then I prefer to use a delegate for the task instead of forcing client code to create a new class. So I guess it then hit me. Should I be doing functional programming instead? Edit: I may have a misconception about functional programming. Currently my language of choice is C#, and I come from a C++ background. I work as a game developer but I am currently unemployed. I have a great passion for architecture. My virtues are clean, flexible, reusable and maintainable code. I don't know if I have been poisoned by these ways or if it is for the better. Am I having a refactoring fever or should I move on? I understand this might be a question about "use the right tool for the job", but I'd like to hear your thoughts. Should I pick up a functional language? One of my fear factors is to leave the comfort of Visual Studio.

    Read the article

  • The Oldest Big Data Problem: Parsing Human Language

    - by dan.mcclary
    There's a new whitepaper up on Oracle Technology Network which details the use of Digital Reasoning Systems' Synthesys software on Oracle Big Data Appliance.  Digital Reasoning's approach is inherently "big data friendly," as it leverages multiple components of the Hadoop ecosystem.  Moreover, the paper addresses the oldest big data problem of them all: extracting knowledge from human text.   You can find the paper here.   From the Executive Summary: There is a wealth of information to be extracted from natural language, but that extraction is challenging. The volume of human language we generate constitutes a natural Big Data problem, while its complexity and nuance requires a particular expertise to model and mine. In this paper we illustrate the impressive combination of Oracle Big Data Appliance and Digital Reasoning Synthesys software. The combination of Synthesys and Big Data Appliance makes it possible to analyze tens of millions of documents in a matter of hours. Moreover, this powerful combination achieves four times greater throughput than conducting the equivalent analysis on a much larger cloud-deployed Hadoop cluster.

    Read the article

  • How to popularize Nemerle (or another programming language)?

    - by keykeeper
    Any .NET developer who is interested in different programming languages knows that F# is the most popular functional language for the .NET platform nowadays. The only thing explaining the popularity of F# is the great support from Microsoft. But we are not limited to F# at all. There are some other functional languages on the .NET platform. I'm very disappointed by the fact that Nemerle isn't well-known. It's an awesome language which supports three paradigms: object-oriented, functional and meta-programming. I won't try to explain why I like it so much. The problem is that I can't use it at work. I think that only really brave companies can rely on Nemerle. It's almost unknown, which is why it's hard to find new developers for the project. No one wants to make a first step with Nemerle if it can influence the budget, which is reasonable. So, here is a question: what can I do to make Nemerle more popular? Here are my first ideas: implement open-source projects using Nemerle; give presentations at different conferences; write articles.

    Read the article

  • Mutating Programming Language?

    - by MattiasK
    For fun I was thinking about how one could build a programming language that differs from OOP and came up with this concept. I don't have a strong foundation in computer science so it might be commonplace without me knowing it (more likely it's just a stupid idea :) I apologize in advance for this somewhat rambling question :) Anyways here goes: In normal OOP, methods and classes are variant only upon parameters, meaning if two different classes/methods call the same method they get the same output. My, perhaps crazy, idea is that the calling method and class could be an "invisible" part of its signature and the response could vary depending on who calls a method. Say that we have a Window object with a Break() method; now anyone (who has access) could call this method on Window with the same result. Now say that we have two different objects, Hammer and SledgeHammer. If Break needs to produce different results based on these, we'd pass them as parameters: Break(IBluntObject bluntObject). With a mutating programming language (MPL) the objects operating on the method would be visible to the Break method without being explicitly defined, and it could adapt itself based on them. So if SledgeHammer calls Window.Break() it would generate vastly different results than if Hammer did so. If OOP classes are black boxes, then MPL classes are black boxes that know who's (trying) to push their buttons and can adapt accordingly. You could also have different permission sets on methods depending on who's calling them, rather than having absolute permissions like public and private. Does this have any advantage over OOP? Or perhaps I should say, would it add anything to it, since you should be able to simply add this aspect to methods (just give access to a CallingMethod and CallingClass variable in context)? I'm not sure, it might be too hard to wrap one's head around. It would be kinda interesting to have classes that adapted themselves to whoever uses them though. Still, it's an interesting concept; what do you think, is it viable?
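
    A rough Python simulation of the idea (the Window/Hammer/SledgeHammer names come from the question; peeking at the caller's stack frame with inspect is just one way to fake the effect):

        import inspect

        class Window:
            def break_(self):    # "break" is a Python keyword, hence the underscore
                # Look up who called us: the `self` in the caller's frame, if any.
                caller = inspect.stack()[1].frame.f_locals.get("self")
                kind = type(caller).__name__ if caller is not None else "unknown"
                if kind == "SledgeHammer":
                    return "window shattered completely"
                if kind == "Hammer":
                    return "window cracked"
                return "window unharmed"

        class Hammer:
            def swing_at(self, window):
                return window.break_()        # Window sees Hammer as the caller

        class SledgeHammer:
            def swing_at(self, window):
                return window.break_()        # Window sees SledgeHammer as the caller

        w = Window()
        print(Hammer().swing_at(w))           # -> window cracked
        print(SledgeHammer().swing_at(w))     # -> window shattered completely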

    Read the article

  • Is there a language more general than Lisp?

    - by Jon Purdy
    I've been programming for a long time, and writing in Lisp (well, mostly Scheme) for a little less. My experience in these languages (and other functional languages) has informed my ability to write clean code even with less powerful tools. Lisp-family languages have lovely facilities for implementing every abstraction in common use: S-expressions generalise structure. Macros generalise syntax. Continuations generalise flow control. But I'm dissatisfied. Somehow, I want more. Is there a language that's more general? More powerful? As great as Lisp is, I find it hard to believe no one has come up with anything (dare I say) better. I'm well aware that ordinarily a question like this ought to be closed for its argumentative nature. But there seems to be a broad consensus that Lisp represents the theoretical pinnacle of programming language design. I simply refuse to accept that without some kind of proof. Which I guess amounts to questioning whether the lambda calculus is in fact the ideal abstraction of computation.

    Read the article
