Search Results

Search found 3516 results on 141 pages for 'malloc history'.

Page 19 of 141

  • How long was Microsoft working on .NET before they released it?

    - by Richard DesLonde
    With the whole CLI, CTS, CLS, etc., not only did they release a powerful platform/infrastructure, but they also released all the specs that describe it. It supports a potentially unlimited number of languages, platforms, and so on. This seems like an insane amount of work, even for a behemoth like Microsoft - especially since it turns out they did a damn good job. How long were they working on this before releasing it (.NET 1.0)?

    Read the article

  • Why isn't DSM for unstructured memory done today?

    - by sinned
    Ages ago, Dijkstra invented IPC through mutexes, which then somehow led to shared memory (SHM) in Multics (which AFAIK had the necessary mmap first). Then computer networks came up and DSM (distributed SHM) was invented for IPC between computers. So DSM is basically an unstructured memory region (like SHM) that magically gets synchronized between computers without the application programmer taking action. Implementations include TreadMarks (unofficially dead now) and CRL. But then someone thought this was not the right way to do it and invented Linda and tuple spaces. Current implementations include JavaSpaces and GigaSpaces. Here, you have to structure your data into tuples. Other ways to achieve similar effects may be the use of a relational database or a key-value store like Riak. Although someone might argue otherwise, I don't consider these DSM, since there is no coherent memory region where you can place data structures as you like; instead you have to structure your data, which can be hard if it is contiguous, and administration such as locking cannot be done for hard-coded parts (tuples, ...). Why is there no DSM implementation today, or am I just unable to find one?
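
    To make the contrast concrete, here is a minimal, in-process Java sketch of the tuple-space style (write/take on structured tuples) that the question sets against DSM's flat, coherent memory region. It is a toy illustration with made-up names, not the JavaSpaces API and not distributed:

        import java.util.ArrayList;
        import java.util.List;
        import java.util.function.Predicate;

        // A toy, in-process "tuple space": writers deposit tuples, readers block
        // until a tuple matching their template is available and remove it.
        public class ToyTupleSpace {
            private final List<Object[]> tuples = new ArrayList<>();

            public synchronized void write(Object... tuple) {
                tuples.add(tuple);
                notifyAll(); // wake up any blocked take() calls
            }

            public synchronized Object[] take(Predicate<Object[]> template)
                    throws InterruptedException {
                while (true) {
                    for (int i = 0; i < tuples.size(); i++) {
                        if (template.test(tuples.get(i))) {
                            return tuples.remove(i);
                        }
                    }
                    wait(); // block until another thread writes a tuple
                }
            }

            public static void main(String[] args) throws InterruptedException {
                ToyTupleSpace space = new ToyTupleSpace();
                new Thread(() -> space.write("temperature", 21)).start();
                Object[] t = space.take(tuple -> "temperature".equals(tuple[0]));
                System.out.println(t[0] + " = " + t[1]); // temperature = 21
            }
        }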

    Read the article

  • Google Closure Compiler - what does the name mean?

    - by mikez302
    I am curious about the Google Closure Compiler. Why did they name it that? Does it have anything to do with lexical closures? EDIT: I tried researching it in the FAQ and documentation, as well as doing Google searches such as "closure compiler name". I couldn't find anything definite, hence the reason I am asking. I don't think I will get a profoundly helpful answer but I was hoping that I could at least satisfy my curiosity. I am not trying to solve a specific problem. I am just curious.
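
    For reference, a lexical closure is a function that captures variables from its enclosing scope. A minimal Java illustration of the concept (this is only to clarify the term used in the question, not a claim about how the Closure Compiler got its name):

        import java.util.function.IntSupplier;

        // A lexical closure: the returned lambda "closes over" the local state
        // created in makeCounter(), keeping it alive after the method returns.
        public class ClosureDemo {
            static IntSupplier makeCounter(int start) {
                int[] count = {start};      // boxed so the lambda can mutate it
                return () -> count[0]++;    // captures `count` from the enclosing scope
            }

            public static void main(String[] args) {
                IntSupplier next = makeCounter(10);
                System.out.println(next.getAsInt()); // 10
                System.out.println(next.getAsInt()); // 11
            }
        }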

    Read the article

  • Why is the filesystem preferred for logs instead of an RDBMS?

    - by Yasir
    The question should be clear from its title. For example, Apache saves its access and error logs in files instead of an RDBMS, no matter how large or small the scale at which it is used. For an RDBMS we just have to write SQL queries and it will do the work, while for files we must decide on a particular format and then write regexes, or maybe parsers, to manipulate them. And those might even fail in particular circumstances if great care is not taken. Yet everyone seems to prefer the filesystem for maintaining logs. I am not biased against either of these methods, but I would like to know why it is practiced like this. Is it speed, maintainability, or something else?
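
    To make the file-parsing side of the trade-off concrete, here is a minimal Java sketch that pulls fields out of one line in the Apache "common" log format with a regex (the sample log line is made up):

        import java.util.regex.Matcher;
        import java.util.regex.Pattern;

        // Hand-written parsing of the Apache "common" log format -- the kind of
        // regex work the question contrasts with simply issuing SQL queries.
        public class AccessLogParser {
            // host ident authuser [date] "request" status bytes
            private static final Pattern COMMON_LOG = Pattern.compile(
                "^(\\S+) (\\S+) (\\S+) \\[([^\\]]+)\\] \"([^\"]*)\" (\\d{3}) (\\S+)$");

            public static void main(String[] args) {
                String line = "127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "
                            + "\"GET /index.html HTTP/1.0\" 200 2326";
                Matcher m = COMMON_LOG.matcher(line);
                if (m.matches()) {
                    System.out.println("host   = " + m.group(1));
                    System.out.println("status = " + m.group(6));
                    System.out.println("bytes  = " + m.group(7));
                } else {
                    System.out.println("line did not match the expected format");
                }
            }
        }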

    Read the article

  • 40 Vintage Computer Ads of Yesteryear [Image Collection]

    - by Asian Angel
    Earlier this week we shared an awesome retro ad for a 10 MB hard-drive with you and today we are back with more classic ad goodness. Travel into the past with these forty vintage computer ads from yesteryear! Special thanks to ETC reader George for sharing this awesome link with us! 40 Vintage Computer Ads of Yesteryears [HongKiat]

    Read the article

  • What triggered the popularity of lambda functions in modern mainstream programming languages?

    - by Giorgio
    In the last few years anonymous functions (AKA lambda functions) have become a very popular language construct, and almost every major/mainstream programming language has introduced them or is planning to introduce them in an upcoming revision of the standard. Yet anonymous functions are a very old and very well-known concept in mathematics and computer science (invented by the mathematician Alonzo Church around 1936, and used by the Lisp programming language since 1958; see e.g. here). So why didn't today's mainstream programming languages (many of which originated 15 to 20 years ago) support lambda functions from the very beginning, only introducing them later? And what triggered the massive adoption of anonymous functions in the last few years? Is there some specific event, new requirement or programming technique that started this phenomenon? IMPORTANT NOTE: The focus of this question is the introduction of anonymous functions in modern, mainstream (and therefore, maybe with a few exceptions, non-functional) languages. Also, note that anonymous functions (blocks) are present in Smalltalk, which is not a functional language, and that normal named functions have been present even in procedural languages like C and Pascal for a long time. Please do not overgeneralize your answers by speaking about "the adoption of the functional paradigm and its benefits", because this is not the topic of the question.
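
    As a concrete example of the construct in question, here is the same comparator written twice in Java: once as a pre-Java-8 anonymous inner class, and once with the lambda syntax that Java 8 (2014) added:

        import java.util.Arrays;
        import java.util.Comparator;
        import java.util.List;

        // The same sort-by-length comparator, before and after Java grew lambdas.
        public class LambdaDemo {
            public static void main(String[] args) {
                List<String> names = Arrays.asList("Church", "McCarthy", "Backus");

                // Anonymous inner class: the closest pre-Java-8 equivalent.
                names.sort(new Comparator<String>() {
                    @Override
                    public int compare(String a, String b) {
                        return Integer.compare(a.length(), b.length());
                    }
                });

                // Anonymous function (lambda), introduced in Java 8.
                names.sort((a, b) -> Integer.compare(a.length(), b.length()));

                System.out.println(names); // [Church, Backus, McCarthy]
            }
        }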

    Read the article

  • Why the sudden increase in the number of Git submitters on the Debian popcon graph in 2010-01?

    - by Jungle Hunter
    In almost every article I've read comparing Git and Mercurial, it seems like Mercurial has a better command-line UX, with each command being limited to one idea only (unlike, say, git checkout). But at some point Git suddenly started looking super popular, and the number of Git submitters on the Debian popcon graph (see graph image below) literally exploded. Source: Debian. What happened in 2010-01 that things suddenly changed? It looks like GitHub was founded earlier than that - 2008.

    Read the article

  • First languages with generic programming support

    - by oluies
    Which was the first language with generic programming support, and what was the first major (widely used) statically typed language with generics support? Generics implement the concept of parameterized types to allow for multiple types. The term generic means "pertaining to or appropriate to large groups of classes." I have seen the following mentions of "first": First-order parametric polymorphism is now a standard element of statically typed programming languages. Starting with System F [20,42] and functional programming languages, the constructs have found their way into mainstream languages such as Java and C#. In these languages, first-order parametric polymorphism is usually called generics. From "Generics of a Higher Kind", Adriaan Moors, Frank Piessens, and Martin Odersky. Generic programming is a style of computer programming in which algorithms are written in terms of to-be-specified-later types that are then instantiated when needed for specific types provided as parameters. This approach, pioneered by Ada in 1983. From Wikipedia, Generic Programming.
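
    For readers who want the idea in code rather than terminology, here is a minimal Java example of parametric polymorphism ("generics"): Pair is written once for to-be-specified-later types and instantiated for specific ones.

        // A minimal example of parametric polymorphism ("generics"):
        // Pair is written once, for to-be-specified-later types A and B.
        public class Pair<A, B> {
            private final A first;
            private final B second;

            public Pair(A first, B second) {
                this.first = first;
                this.second = second;
            }

            public A first()  { return first; }
            public B second() { return second; }

            // A generic method: works for any choice of A and B.
            public static <A, B> Pair<B, A> swap(Pair<A, B> p) {
                return new Pair<>(p.second(), p.first());
            }

            public static void main(String[] args) {
                Pair<String, Integer> entry = new Pair<>("answer", 42);
                Pair<Integer, String> swapped = swap(entry);
                System.out.println(swapped.first() + " -> " + swapped.second()); // 42 -> answer
            }
        }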

    Read the article

  • The Future According to Films [Infographic]

    - by Jason Fitzpatrick
    Curious what the future will look like? According to movie directors casting their lens towards the future of humanity, it’s quite a mixed bag. Check out this infographic timeline to see the next 300,000 years of human evolution. A quick glance over the timeline shows a series of futures where things can quickly go from fun times to end-of-the-world times. We’d like to, for example, live it up in the Futurama future of 3000 AD and not the Earth-gets-destroyed future of Titan A.E.’s 3028. Hit up the link below for a high-res copy of the infographic. The Future According to Films [Tremulant Design via Geeks Are Sexy]

    Read the article

  • Virtual Newsstand Displays Comic Books by Date

    - by Jason Fitzpatrick
    If you’re a comic book aficionado (or just want to take a stroll down memory lane), this virtual newsstand shows you all the comics published for any month and year going all the way back to the 1930s. Courtesy of Mike’s Amazing World of Comics, the virtual newsstand lets you dial in a month, year, and sorting style, and shows all publishers or selected publishers. The covers are displayed in a grid where you can click through to see a larger version of the cover and read additional information about the comic. It’s a really neat way to check out trends in comic design and artwork over the years. Hit up the link below to take it for a spin. Have a cool comic book resource to share? Sound off in the comments. The Newsstand [via Boing Boing]

    Read the article

  • Examples of "lost art" in software technology/development

    - by mamcx
    With the advent of a new technology, some old ideas - despite being good - are forgotten in the process. I read a lot about how some "new" thing was already present in Lisp, like 60 years ago, and only recently resurfaced under another name or on top of another language. Now it looks like the new old thing is building functional, immutable, lock-free-threading stuff... and that makes me wonder what has been "lost" in the art of software development. What ideas are almost forgotten, waiting to resurface? One of mine: I remember when I coded in FoxPro. The idea of having a full stack for developing database apps without impedance mismatch is something I truly miss. In current languages, I have never found another environment that matches how easy it was to develop in Fox back then. What else is missing?

    Read the article

  • Is MUMPS alive?

    - by ern0
    At my first workplace we were using Digital Standard MUMPS on a PDP-11 clone (TPA 440), then we switched to Micronetics Standard MUMPS running on a Hewlett-Packard machine, HP-UX 9, around the early '90s. Is MUMPS still alive? Is anyone using it? If yes, please write a few words about it: are you using it in character mode, does it act as a web server, etc.? (I mean Caché, too.) If you've used it, what were your feelings about it? Did you like it?

    Read the article

  • What significant lost advances on side tracks should be revived in the main stream of software?

    - by C.W.Holeman II
    In reading Alan Kay's question on Significant new inventions, I was coming up with answers that were not new ideas but old ones that have been passed by in the mainstream. So, what significant lost advances that happened on a side track should be revived in the mainstream of software? These would be ideas that worked well in their context, but then a different context appeared and became dominant, even though it lacked the idea. It may be that in the new dominant context one simply cannot do something, or that it requires great effort to be exerted.

    Read the article

  • First ATMs' programming language

    - by revo
    The first ATMs acted as cash dispensers; they were offline machines which worked with carbon-impregnated punch cards and a 6-digit PIN code. The maximum withdrawal per card was 10 pounds, and each card was single-use - the ATM swallowed it! The first ATM was installed in London in 1967. Looking at the timeline of programming languages, many languages had been created before that decade. I don't know about the hardware either, but which programming language was it written in? *I couldn't find a detailed biography of John Shepherd-Barron (the ATM's inventor). Update: I found this picture, taken from an Iranian newspaper from 1972. Translated PS: It shows Mr. Rad-lon (if spelled correctly), the manager of the Barros (if spelled correctly) International Educational Institute in the United Kingdom, at the right, and Mr. Jim Sutherland, an expert on computer kiosks. In the rest of the text on this paper, these kinds of ATMs, which were called "Automated Computer Kiosks", were advertised like this: Mr. Rad-lon (if spelled correctly) puts his card into one specific spot on the Automated Computer Kiosk and after 10 seconds he withdraws his cash. Two more questions: 1. How were those ATMs so fast? (withdrawal in 10 seconds, in that year) 2. I couldn't find any text on the Internet that mentions "Automated Computer Kiosk"; is the term valid, or were they simply called computers at that time?

    Read the article

  • Clear Linux file system after shutdown / start

    - by user35443
    I have a very specific task. I need to clear the desktop, downloads, documents and so on after every shutdown or session finish. For example, if anyone downloads something using Google Chrome, he will work with it and then shut down the computer for the next use. And when a second user sits down to work on the computer, he'll find a clean file system without the data downloaded by the first user. On Windows, I used to work with Returnil Virtual System, but it doesn't have support for Linux. Can anybody tell me if it is possible and, if so, how? I was also thinking of using Wine for this program, but I don't think it would be the best idea.

    Read the article

  • Origin of common list-processing function names

    - by Heatsink
    Some higher-order functions for operating on lists or arrays have been repeatedly adopted or reinvented. The functions map, fold[l|r], and filter are found together in several programming languages, such as Scheme, ML, and Python, that don't seem to have a common ancestor. I'm going with these three names to keep the question focused. To show that the names are not universal, here is a sampling of names for equivalent functionality in other languages. C++ has transform instead of map and remove_if instead of filter (reversing the meaning of the predicate). Lisp has mapcar instead of map, remove-if-not instead of filter, and reduce instead of fold (Some modern Lisp variants have map but this appears to be a derived form.) C# uses Select instead of map and Where instead of filter. C#'s names came from SQL via LINQ, and despite the name changes, their functionality was influenced by Haskell, which was itself influenced by ML. The names map, fold, and filter are widespread, but not universal. This suggests that they were borrowed from an influential source into other contemporary languages. Where did these function names come from?
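
    To anchor the names, here is the same trio in Java, where the Stream API settled on map, filter, and reduce (reduce being the fold of the question):

        import java.util.List;
        import java.util.stream.Collectors;

        // The three operations the question traces, under the names Java's
        // Stream API chose: map, filter, and reduce.
        public class MapFilterFold {
            public static void main(String[] args) {
                List<Integer> xs = List.of(1, 2, 3, 4, 5);

                List<Integer> squares = xs.stream()
                        .map(x -> x * x)              // map / transform / Select
                        .collect(Collectors.toList());

                List<Integer> evens = xs.stream()
                        .filter(x -> x % 2 == 0)      // filter / remove_if / Where
                        .collect(Collectors.toList());

                int sum = xs.stream()
                        .reduce(0, Integer::sum);     // foldl / reduce

                System.out.println(squares + " " + evens + " " + sum);
                // [1, 4, 9, 16, 25] [2, 4] 15
            }
        }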

    Read the article

  • Why did the web win the space of remote applications, and not X?

    - by Martin Josefsson
    The X Window System is 25 years old; it had its birthday yesterday (on the 15th). As you are probably aware, one of its most important features is the separation of the server side and the client side in a way that neither Microsoft's, Apple's, nor Wayland's windowing systems have. Back in the day (sorry for the ambiguous phrasing), many believed X would dominate over other ways to make windows because of this separation of server and client, allowing the application to be run on a server somewhere else while the user clicks and types on her own computer at home. This use obviously still exists, but is marginalized at best. When we write and use programs that run on a server, we almost always use the web with its HTML/CSS/JS. Why did the web win, and not X? The technologies used for the web (said HTML/CSS/JS) are a mess. Combined with all the back-end frameworks (Rails, Django and all), it really is a jungle to navigate through. Still the web thrives with creativity and progress, while remote X apps do not.

    Read the article

  • What significant advances lost on side tracks should be revived in the main stream of software?

    - by C.W.Holeman II
    In reading Alan Kay's question on Significant new inventions, I was coming up with answers that were not new ideas but old ones that have been passed by in the mainstream. So, what significant lost advances that happened on a side track should be revived in the mainstream of software? These would be ideas that worked well in their context, but then a different context appeared and became dominant, even though it lacked the idea. It may be that in the new dominant context one simply cannot do something, or that it requires great effort to be exerted.

    Read the article

  • Earliest use of Comments as Semantically Meaningful Things in a Program?

    - by Alan Storm
    In certain corners of the PHP meta-programming world, it's become fashionable to use PHPDoc comments as a mechanism for providing semantically meaningful information to a program. That is, other code will parse the doc blocks and do something significant with the information encoded in those comments. Doctrine's annotations and code generation are an example of this. What's the earliest (or some early) use of this technique? I have vague memories of some early Java Design by Contract implementations doing similar things, but I'm not sure if those folks invented the technique or got it from somewhere else. Mainly asking so I can provide some historical context for PHP developers who haven't come across the technique before, and are distrustful of it because it seems a little crazy pants.
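
    For comparison, here is a small Java sketch of the same "semantic metadata on code" idea after it was promoted from comments to a first-class language feature: a hypothetical @Route annotation whose data other code reads back at runtime via reflection, much as Doctrine parses doc blocks. The annotation name is made up for illustration.

        import java.lang.annotation.Retention;
        import java.lang.annotation.RetentionPolicy;
        import java.lang.reflect.Method;

        // A hypothetical @Route annotation, read back at runtime by reflection --
        // semantically meaningful metadata without parsing comments.
        public class AnnotationDemo {
            @Retention(RetentionPolicy.RUNTIME)
            @interface Route {
                String value();
            }

            static class Controller {
                @Route("/hello")
                public void hello() { System.out.println("Hello!"); }
            }

            public static void main(String[] args) {
                for (Method m : Controller.class.getDeclaredMethods()) {
                    Route route = m.getAnnotation(Route.class);
                    if (route != null) {
                        System.out.println(route.value() + " -> " + m.getName());
                    }
                }
            }
        }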

    Read the article

  • Are Java's public fields just a tragic historical design flaw at this point?

    - by Avi Flax
    It seems to be Java orthodoxy at this point that one should basically never use public fields for object state. (I don't necessarily agree, but that's not relevant to my question.) Given that, would it be right to say that from where we are today, it's clear that Java's public fields were a mistake/flaw of the language design? Or is there a rational argument that they're a useful and important part of the language, even today? Thanks! Update: I know about the more elegant approaches, such as in C#, Python, Groovy, etc. I'm not directly looking for those examples. I'm really just wondering if there's still someone deep in a bunker, muttering about how wonderful public fields really are, and how the masses are all just sheep, etc. Update 2: Clearly static final public fields are the standard way to create public constants. I was referring more to using public fields for object state (even immutable state). I'm thinking that it does seem like a design flaw that one should use public fields for constants, but not for state… a language's rules should be enforced naturally, by syntax, not by guidelines.
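
    For readers outside the Java world, the two styles being contrasted look like this (a neutral illustration, not an endorsement of either):

        // State exposed as public fields versus the orthodox
        // private-field-plus-accessor form.
        public class FieldsDemo {
            static class PointWithPublicFields {
                public int x;   // callers read and write the state directly
                public int y;
            }

            static class PointWithAccessors {
                private int x;
                private int y;

                public int getX() { return x; }
                public void setX(int x) { this.x = x; }
                public int getY() { return y; }
                public void setY(int y) { this.y = y; }
            }

            public static void main(String[] args) {
                PointWithPublicFields a = new PointWithPublicFields();
                a.x = 3;          // direct field access
                PointWithAccessors b = new PointWithAccessors();
                b.setX(3);        // access mediated by a method
                System.out.println(a.x + " " + b.getX()); // 3 3
            }
        }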

    Read the article

  • Ubuntu Philosophy Question

    - by The-Ever-Kid
    I was just switching from Windows to Ubuntu and I started to read this. There were quite a few things I did not understand. One was: "OpenOffice decided not to have a learning curve." Another: "Firefox tries very hard to make sure pages written in 1995 look like they did in 1995." And finally: "Windows isn't a poor man's Linux." Shouldn't that final statement be the other way around?

    Read the article

  • What was the first programming language written for computers?

    - by ThePlan
    Looking at the many programming languages we have today, each one unique in its own way, I've tried to figure out what the first programming language written for computers was. Looking at the release dates of the popular ones I got somewhat close, but I didn't look at the less obvious ones - programming languages which are either dead or see very little use nowadays. Fortran is the closest thing I found, but I don't know if that's right. In a nutshell: What was the first programming language written for computers? Are there any languages that derived from it?

    Read the article

  • Why can't `main` return a double or String rather than int or void?

    - by sunny
    In many languages such as C, C++, and Java, the main method/function has a return type of void or int, but not double or String. What might be the reasons behind that? I know a little: we can't do that because main is called by the runtime library, which expects some signature like int main() or int main(int, char**), so we have to stick to that. So my question is: why does main have the type signature that it has, and not a different one?
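
    One way to see why a small integer is the natural "return type" is that the only consumer of main's result is the operating system, which treats it as a process exit status. A minimal Java illustration (Java's main is void, so the status goes through System.exit):

        // Java's entry point must be `public static void main(String[] args)`;
        // the only value the environment consumes is the int process exit
        // status, reported via System.exit rather than a return value.
        public class ExitStatusDemo {
            public static void main(String[] args) {
                if (args.length == 0) {
                    System.err.println("usage: ExitStatusDemo <name>");
                    System.exit(2);   // a non-zero status signals failure to the shell
                }
                System.out.println("Hello, " + args[0]);
                // Returning normally from main exits the process with status 0
                // (once all non-daemon threads have finished).
            }
        }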

    Read the article

  • Book Readers As Envisioned Circa 1935

    - by Jason Fitzpatrick
    This early 20th century sketch showcases the future of books; thankfully the actual delivery of the concept proved to be a bit more lap-friendly. The sketch is from the April 1935 issue of Everyday Science and Mechanics and presents a vision of book consumption that, thankfully, came to pass in a much more compact fashion that doesn’t require swapping rolls of film. [via Boing Boing]

    Read the article

  • What significant lost steps forward in side tracks should be revived in the main stream of software?

    - by C.W.Holeman II
    In reading Alan Kay's question on Significant new inventions, I was coming up with answers that were not new ideas but old ones that have been passed by in the mainstream. So, what significant lost steps forward that happened in a side track should be revived in the mainstream of software? These would be ideas that worked well in their context, but then a different context appeared and became dominant, even though it lacked the idea. It may be that in the new dominant context one simply cannot do something, or that it requires great effort to be exerted.

    Read the article
