Search Results

Search found 3516 results on 141 pages for 'malloc history'.

Page 27/141

  • Who are the most important people in open-source software? [closed]

    - by poseid
    I am reading a book by Malcolm Gladwell on the circumstances of successful careers. The book argues that Bill Gates, Steve Jobs, Bill Joy and other successful computer pioneers were born between 1950 and 1955, and had put in around 10,000 hours of practice before microcomputers became widely available in the 1970s and their fairy-tale success stories began. As we are in the age of Web 2.0, with new forms of databases and pervasive access to information, who are, in your opinion, the most successful computer programmers or scientists of our time, when were they born, and which technologies did they have access to?

    Read the article

  • Are there old versions of Windows UX guidelines somewhere?

    - by Camilo Martin
    Having read the Windows User Experience Interaction Guidelines (there's a PDF download available), I found it to be admirably self-deprecating, humbly pointing out their own horrible UI practices long scolded by Joel Spolsky. I'd like to know, however, what they had in mind while they made those mistakes. Is this (terrific) UX Guidelines document something new, or were there earlier editions of it? If so, where can I find them? My prayers to Google yielded no leniency.

    Read the article

  • WordPerfect programmers refusing to use anything but assembler

    - by Totophil
    There is a version of events (popularised by Joel Spolsky) attributing the demise of WordPerfect to a refusal by its programmers to use anything but assembler, which led to the delay of the first WPwin release and, as a result, eventually to losing the all-important battle with Microsoft. There are a few references to programming work being done in assembler in the autobiographical book "Almost Perfect" by W. E. Pete Peterson, who used to have a major influence on the running of the corporation. But those references go back to the early '80s, when WordPerfect was trying to gain a significant market share by defeating WordStar, and not the early nineties, when the battle with Microsoft took place. I am looking for a second, independent source to confirm the assumption. Maybe someone who worked for WordPerfect Corporation at the time, who was close to the company, or who had a chance to see the source could clarify the issue. Your help is much appreciated, thanks! Please note that this question is not about any other theories or reasons behind WordPerfect's demise. I really just need to clarify whether they used assembler as the primary language for WPwin and (as a bonus, really) whether there were discussions held within the corporation about assembler being the right choice. Concisely: Did WPCorp use assembler as the primary language for WPwin? Were discussions held at the time amongst WP Corp staff about assembler being the right choice (was it a management or programmers' decision)?

    Read the article

  • What are programming lost arts?

    - by pavpanchekha
    Have you ever programmed raw machine code (not for class)? Examined a hex dump with just a hex editor (or, heck, without)? Written your own software floating-point library? Division library? Written a non-school-assignment in Lisp or Forth? What sort of "lost arts" have been forgotten? And what reason (if any) would there be to resurrect them?

    Read the article

  • Significant new inventions in computing since 1980

    - by Alan Kay
    This question arose from comments about different kinds of progress in computing over the last 50 years or so. I was asked by some of the other participants to raise it as a question to the whole forum. The basic idea here is not to bash the current state of things but to try to understand something about the progress of coming up with fundamental new ideas and principles. I claim that we need really new ideas in most areas of computing, and I would like to know of any important and powerful ones that have been done recently. If we can't really find them, then we should ask "Why?" and "What should we be doing?"

    Read the article

  • Who in the software world do you admire the most?

    - by David McGraw
    In an effort to spark some discussion and to find interesting people I didn't know about: is there anybody in the software industry whom you really admire? Perhaps admire is the wrong choice of word, but I'm sure there is somebody out there who has impacted you in at least a minor way. What did you learn from this individual that defines what you try to achieve today?

    Read the article

  • Why do C compilers prepend underscores to external names?

    - by Michael Burr
    I've been working in C for so long that the fact that compilers typically add an underscore to the start of an extern is just understood... However, another SO question today got me wondering about the real reason why the underscore is added. A Wikipedia article claims that one reason is: "It was common practice for C compilers to prepend a leading underscore to all external scope program identifiers to avert clashes with contributions from runtime language support." I think there's at least a kernel of truth to this, but it doesn't really answer the question, since if the underscore is added to all externs, it won't help much with preventing clashes. Does anyone have good information on the rationale for the leading underscore? Is the added underscore part of the reason that the Unix creat() system call doesn't end with an 'e'? I've heard that early linkers on some platforms had a limit of 6 characters for names. If that's the case, then prepending an underscore to external names would seem to be a downright crazy idea (now I only have 5 characters to play with...).
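
    A quick way to see the convention in action is to compile a tiny translation unit (file name arbitrary) and inspect its symbol table; this is only a sketch, and which platforms actually add the underscore is something to verify with nm or an equivalent tool on your own system:

        /* underscore_demo.c -- minimal example for inspecting external symbol names */
        int my_external_variable = 42;

        int my_external_function(void)
        {
            return my_external_variable;
        }

        /* Build and inspect (assumed tooling):
         *     cc -c underscore_demo.c && nm underscore_demo.o
         * On targets that use the leading-underscore convention (classic a.out,
         * Mach-O on macOS, 32-bit Windows cdecl), the symbols typically appear as
         * _my_external_variable and _my_external_function; on most modern ELF/Linux
         * targets they appear without the underscore. */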

    Read the article

  • Why did anybody ever use COBOL?

    - by sarzl
    I know: you and I hate COBOL. I took a look at a lot of code examples, and it didn't take me long to see why everybody tries to avoid it. So I really have no idea: why was COBOL ever used? I mean: hey, there was Fortran before it, and Fortran looks like a jesus-language compared to COBOL. This isn't meant to be argumentative but historical, as I'm young and didn't even know about COBOL until four months ago.

    Read the article

  • Why is the software world full of status codes?

    - by David V McKay
    Why did programmers ever start using status codes? I mean, I guess I can imagine this might have been useful back in the days when a text string was an expensive resource. WAYYY back then. But even after we had megabytes of memory to work with, we continued to use them. What possible advantage could there be in obfuscating the meaning of an error or status message behind a status code?

    Read the article

  • Cobol: science and fiction

    - by user847
    There are a few threads about the relevance of the Cobol programming language on this forum; e.g., this thread links to a collection of them. What I am interested in here is a frequently repeated claim based on a study by Gartner from 1997: that there were around 200 billion lines of Cobol code in active use at that time! I would like to ask some questions to verify or falsify a couple of related points. My goal is to understand whether this statement has any truth to it or whether it is totally unrealistic. I apologize in advance for being a little verbose in presenting my line of thought and my own opinion on the things I am not sure about, but I think it might help to put things in context and thus highlight any wrong assumptions and conclusions I have made.

    Sometimes, the "200 billion lines" number is accompanied by the added claim that this corresponded to 80% of all programming code in any language in active use. Other times, the 80% merely refers to so-called "business code" (or some other vague phrase hinting that the reader is not to count mainstream software, embedded systems or anything else where Cobol is practically non-existent). In the following I assume that the code does not include double-counting of multiple installations of the same software (since that is cheating!).

    In particular, in the run-up to the Y2K problem, it was noted that a lot of Cobol code was already 20 to 30 years old. That would mean it was written in the late '60s and '70s. At that time, the market leader was IBM with the IBM/370 mainframe. IBM has put up a historical announcement on its website quoting prices and availability. According to that sheet, prices were about one million dollars for machines with up to half a megabyte of memory.

    Question 1: How many mainframes were actually sold? I have not found any numbers for those times; the latest numbers are for the year 2000, again by Gartner. :^( I would guess that the actual number is in the hundreds or the low thousands; if the market size was 50 billion in 2000 and the market has grown exponentially like any other technology, it might have been merely a few billion back in 1970. Since the IBM/370 was sold for twenty years, twenty times a few thousand would result in a few tens of thousands of machines (and that is pretty optimistic)!

    Question 2: How large were the programs in lines of code? I don't know how many bytes of machine code result from one line of source code on that architecture. But since the IBM/370 was a 32-bit machine, any address access must have used 4 bytes plus the instruction (2, maybe 3 bytes for that?). If you count in the operating system and the program's data, how many lines of code would have fit into a main memory of half a megabyte?

    Question 3: Was there no standard software? Did every single machine sold run a unique hand-coded system without any standard software? Seriously, even if every machine was programmed from scratch without any reuse of legacy code (wait ... didn't that violate one of the claims we started from to begin with???), we might have O(50,000 l.o.c./machine) * O(20,000 machines) = O(1,000,000,000 l.o.c.). That is still far, far, far away from 200 billion! Am I missing something obvious here?

    Question 4: How many programmers would we have needed to write 200 billion lines of code? I am really not sure about this one, but if we take an average of 10 l.o.c. per day, we would need 55 million man-years to achieve this! In a time frame of 20 to 30 years, that would mean two to three million programmers must have existed, constantly writing, testing, debugging and documenting code. That would be about as many programmers as we have in China today, wouldn't it?

    Question 5: What about the competition? So far, I have come up with two things here: 1) IBM had their own programming language, PL/I. Above I have assumed that the majority of code was written exclusively in Cobol; however, all other things being equal, I wonder whether IBM marketing would really have pushed their own development off the market in favor of Cobol on their machines. Was there really no relevant code base of PL/I? 2) Sometimes (also on this board, in the thread quoted above) I come across the claim that the "200 billion lines of code" are simply invisible to anybody outside of "governments, banks ..." (and whatnot). Actually, the DoD had funded their own language in order to increase cost effectiveness and reduce the proliferation of programming languages, which led to their use of Ada. Would they really have worried about having so many different programming languages if they had predominantly used Cobol? If there was any language running on "government and military" systems outside the perception of mainstream computing, wouldn't that language be Ada?

    I hope someone can point out any flaws in my assumptions and/or conclusions and shed some light on whether the above claim has any truth to it or not.
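
    The back-of-envelope arithmetic in Questions 3 and 4 can be reproduced mechanically; the figures below are the question's own assumptions (machine count, lines per machine, productivity), not established facts -- a minimal sketch:

        #include <stdio.h>

        int main(void)
        {
            /* All inputs are the question's assumptions, not established data. */
            double loc_per_machine = 50e3;   /* O(50,000 l.o.c. per machine)    */
            double machines        = 20e3;   /* O(20,000 machines)              */
            double claimed_loc     = 200e9;  /* the Gartner-attributed claim    */
            double loc_per_day     = 10.0;   /* assumed programmer productivity */

            /* ~1 billion lines vs. the claimed 200 billion */
            printf("estimated total:  %.0f lines\n", loc_per_machine * machines);
            /* ~55 million man-years at 10 lines per day */
            printf("man-years needed: %.1f million\n",
                   claimed_loc / (loc_per_day * 365.0) / 1e6);
            return 0;
        }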

    Read the article

  • (For what) Are Fortran, Cobol and Co. used today?

    - by lamas
    I'm a relatively young programmer, so I don't really know much about languages like Fortran or Cobol that have their origins in the early days of modern computing. I'm a bit confused, because it seems like there are many people out there saying that these two languages are still very much alive and used all over the world, whereas others say the opposite. In addition, it seems like there are only very few questions tagged Fortran or Cobol here on Stack Overflow. Can someone "demystify" the situation for me? Who uses these senior languages these days, and are they even used anymore? Do you have any experience with either of these languages, or do you know something about their latest developments?

    Read the article

  • Is there a way to switch bash or zsh from emacs mode to vi mode with a keystroke?

    - by Brandon
    I'd like to be able to switch temporarily from emacs mode to vi mode, since vi mode is sometimes better, but I'm usually halfway through typing something before I realize I want to switch. I don't want to switch permanently to vi mode, because I normally prefer emacs mode on the command line, mostly because it's what I'm used to, and over the years many of the keystrokes have become second nature. (As an editor I generally use emacs in viper mode, so that I can use both vi and emacs keystrokes, since I kept finding myself accidentally using emacs keystrokes in vi and screwing things up, and because in some cases I find vi keystrokes more memorable and handy, and in other cases emacs ones.)

    Read the article

  • Why is RAISERROR misspelled? Or is it not?

    - by Jason
    Why isn't RAISERROR spelled RAISEERROR? Where is the second E? I could understand if it were some ancient keyword-length constraint, but I wouldn't expect it to be a nine-character limit. Is RAIS or RROR a technical word, such that "raise-error" is just a misreading? Are its (immediate) origins in a different language? I've searched Google but haven't found much on the subject.

    Read the article

  • What is the purpose of Java's unary plus operator?

    - by Syntactic
    Java's unary plus operator appears to have come over from C, via C++. As near as I can tell, it has the following effects: it promotes its operand to int, if it's not already an int or wider; it unboxes its operand, if it's a wrapper object; and it slightly complicates the parsing of evil expressions containing large numbers of consecutive plus signs. It seems to me that there are better (or, at least, clearer) ways to do all of these things. In this SO question, concerning the counterpart operator in C#, someone said that "It's there to be overloaded if you feel the need." But in Java, one cannot overload any operator. So does this operator exist in Java just because it existed in C++?
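
    Since the promotion behaviour is inherited from C, a minimal C sketch (not Java code; just an illustration of the underlying rule) shows that unary plus applies the integer promotions to its operand:

        #include <stdio.h>

        int main(void)
        {
            char c = 'A';

            /* Unary plus applies the integer promotions, so +c has type int
             * even though c is a char. Java's unary plus promotes byte, short
             * and char operands to int in much the same way; unboxing is a
             * Java-specific addition with no C counterpart. */
            printf("sizeof c  = %zu\n", sizeof c);   /* always 1 */
            printf("sizeof +c = %zu\n", sizeof +c);  /* sizeof(int), typically 4 */
            return 0;
        }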

    Read the article

  • What does the IUrlHistoryStg::BindToObject method do?

    - by BHOdevelopper
    I'm looking for a way to access the address bar search so that I can append some personal URL at the end of the current list, and I found IUrlHistoryStg::BindToObject, but there is no documentation linked to it. Does anyone know what this method does? On MSDN: http://msdn.microsoft.com/en-us/library/aa767718%28VS.85%29.aspx

    Read the article

  • Is there any kind of standard for 8086 multiprocessing?

    - by Earlz
    Back when I made an 8086 emulator I noticed that there was the LOCK prefix, intended for synchronization in a multiprocessor environment. Yet the only multiprocessing I know of for the x86 architecture involves use of the APIC, which didn't come around until either the Pentiums or the 486s. Was there any kind of standard for 8086 multiprocessing, or was it done by manufacturer-specific extensions to the instruction set and/or special ports? By standard, I mean things like: how do you separate the two processors if they both use the same memory? This is impossible without some kind of way to make each processor execute a different piece of code (or cause an interrupt on only one processor).
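
    For context on what the LOCK prefix actually buys you, here is a minimal sketch of how it is still used today with GCC/Clang-style inline assembly on x86; whether any 8086-era multiprocessor arrangement standardized around it is exactly what the question asks, so this is illustration only:

        #include <stdio.h>

        /* Atomically increment a shared counter using the LOCK prefix.
         * Assumes an x86 target and GCC/Clang extended inline assembly;
         * without LOCK, the read-modify-write done by incl would not be
         * atomic with respect to another processor using the same memory. */
        static void locked_increment(volatile int *counter)
        {
            __asm__ __volatile__("lock; incl %0"
                                 : "+m"(*counter)
                                 :
                                 : "memory", "cc");
        }

        int main(void)
        {
            volatile int counter = 0;
            locked_increment(&counter);
            printf("counter = %d\n", counter);
            return 0;
        }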

    Read the article

  • What inspired WPF?

    - by Andrei Rinea
    I was told by someone that, just as .NET started out inspired by Java, WPF was inspired by a similar technology, because, as they put it, "Microsoft never innovates". However, I can't find anything remotely close to WPF. What particular technology did, or could have, inspired Microsoft to write WPF?

    Read the article

  • What was the single byte change to port WordStar from CP/M to DOS?

    - by amarillion
    I was re-reading Joel's Strategy Letter II: Chicken and Egg problems and came across this fun quote: In fact, WordStar was ported to DOS by changing one single byte in the code. (Real Programmers can tell you what that byte was, I've long since forgotten). I couldn't find any other references to this with a quick Google search. Is this true or just a figure of speech? In the interest of my quest to become a "Real Programmer", what was the single byte change?

    Read the article

  • Why does C's "fopen" take a "const char *" as its second argument?

    - by Chris Cooper
    It has always struck me as strange that the C function "fopen" takes a "const char *" as its second argument. I would think it would be easier both to read your code and to implement the library's code if there were bit masks defined in stdio.h, like "IO_READ" and such, so you could do things like: FILE* myFile = fopen("file.txt", IO_READ | IO_WRITE); Is there a programmatic reason for the way it actually is, or is it just historic? (i.e. "That's just the way it is.")
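
    For comparison, the lower-level POSIX open() call does use or-able bit flags, much like the hypothetical IO_READ | IO_WRITE in the question; a minimal sketch (assuming a POSIX system for open(); fopen() itself is standard C):

        #include <stdio.h>
        #include <fcntl.h>   /* open, O_RDWR, O_CREAT */
        #include <unistd.h>  /* close */

        int main(void)
        {
            /* Standard C: the mode is a string, parsed by the library at run time. */
            FILE *f = fopen("file.txt", "r+");
            if (f)
                fclose(f);

            /* POSIX: the mode is a bit mask built with | from flag macros. */
            int fd = open("file.txt", O_RDWR | O_CREAT, 0644);
            if (fd != -1)
                close(fd);

            return 0;
        }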

    Read the article

  • Where are the new ideas in programming languages?

    - by 0xF
    I've recently been looking into the topic of programming languages, and from what I've seen, few if any serious languages try to do really "new" things that hadn't been seen before their creation. Why do all of the more or less successful programming languages since 1980 or so just combine aspects of their predecessors? I just can't believe that programming languages "can't get any better".

    Read the article

  • What does the 'X' in .aspx, docx, xlsx, etc... represent?

    - by Serapth
    It's one of those things you just take for granted until one day someone asks you and you realize you can't answer it. Much like how, for years, I never questioned the use of 1033 directories in Microsoft products until one day someone asked me about it. Around the release of .NET and Office 2007, Microsoft added an x to basically all of their extensions, and I frankly took it as representing XML, but that simply doesn't make sense with .aspx. So, I realize this is a very non-technical question, but now that it has been asked of me and my googling hasn't given me an answer, can anyone tell me with authority what the X represents? Is it "extended"? XML? Or is there no meaning behind it?

    Read the article
