Search Results

Search found 3761 results on 151 pages for 'revision history'.

Page 26/151 | < Previous Page | 22 23 24 25 26 27 28 29 30 31 32 33  | Next Page >

  • What happened to Perl?

    - by llasa
    I will try to keep this as objective as possible. I've been dealing with PHP for 3 years now; I have always known of Perl but never really "dived" into it. So I took a look at some Perl code examples and I thought: Wow, it's like PHP just failed at cloning it. My questions are: What is bad about Perl? What are the disadvantages that made it so extremely unpopular that it is actually dying right now? Why could PHP take over? What does PHP have (or what did it have in the times of PHP4) that made it rise in popularity compared to Perl? I'm rather young, and the questions above are a bit subjective; I think you can only really answer them if you have experienced the rise of PHP along with the fall of Perl. Unlike my question before, I hope that this one can be more or less completely answered. There have to be definite disadvantages Perl has compared to PHP that made it fall.

    Read the article

  • CSS/JavaScript/hacking: Detect :visited styling on a link *without* checking it directly OR do it faster

    - by Sai Emrys
    This is for research purposes on http://cssfingerprint.com Consider the following code: <style> div.csshistory a { display: none; color: #00ff00;} div.csshistory a:visited { display: inline; color: #ff0000;} </style> <div id="batch" class="csshistory"> <a id="1" href="http://foo.com">anything you want here</a> <a id="2" href="http://bar.com">anything you want here</a> [etc * ~2000] </div> My goal is to detect whether foo has been rendered using the :visited styling. 1. I want to detect whether foo.com is visited without directly looking at $('1').getComputedStyle (or in Internet Explorer, currentStyle), or any other direct method on that element. The purpose of this is to get around a potential browser restriction that would prevent direct inspection of the style of visited links. For instance, maybe you can put a sub-element in the <a> tag, or check the styling of the text directly, etc. Any method that does not directly or indirectly rely on $('1').anything is acceptable. Doing something clever with the child or parent is probably necessary. Note that for the purposes of this point only, the scenario is that the browser will lie to JavaScript about all properties of the <a> element (but not others), and that it will only render color: in :visited. Therefore, methods that rely on e.g. text size or background-image will not meet this requirement. 2. I want to improve the speed of my current scraping methods. The majority of the time (at least with the jQuery method in Firefox) is spent on document.body.appendChild(batch), so finding a way to improve that call would probably be most effective. See http://cssfingerprint.com/about and http://cssfingerprint.com/results for current speed test results. The methods I am currently using can be seen at http://github.com/saizai/cssfingerprint/blob/master/public/javascripts/history_scrape.js To summarize (tl;dr), they are: (a) set color or display on :visited per the above, and check each one directly with getComputedStyle; (b) put the ID of the link (plus a space) inside the <a> tag, and, using jQuery's :visible selector, extract only the visible text (= the visited link IDs). FWIW, I'm a white hat, and I'm doing this in consultation with the EFF and some other fairly well-known security researchers. If you contribute a new method or speedup, you'll get thanked at http://cssfingerprint.com/about (if you want to be :-P), and potentially in a future published paper. ETA: The bounty will be awarded only for suggestions that can, on Firefox, avoid the hypothetical restriction described in point 1 above, or that perform at least 10% faster, on any browser for which I have sufficient current data, than my best-performing methods listed in the graph at http://cssfingerprint.com/about. In case more than one suggestion fits either criterion, the one that does best wins.
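    For reference, here is a minimal TypeScript sketch of the baseline getComputedStyle approach summarized as method (a) above. It is only an illustration under assumptions: the function name scrapeVisited is made up, it presumes the question's <style> rules (with #ff0000 for :visited) are already in the page, and it relies on exactly the direct per-element inspection that point 1 asks to avoid, so it shows the baseline being benchmarked rather than an answer to the question. Modern browsers also deliberately report unvisited styles for :visited links, so treat it as a sketch of the historical technique.

        // Sketch of the baseline :visited scrape described in the question.
        // Assumes the div.csshistory / :visited stylesheet from the question is present.
        function scrapeVisited(urls: string[]): string[] {
          const batch = document.createElement("div");
          batch.className = "csshistory";

          for (const url of urls) {
            const a = document.createElement("a");
            a.href = url;
            a.textContent = url; // content is irrelevant; only the computed color matters
            batch.appendChild(a);
          }

          // The expensive call the question mentions: one insertion/reflow for the whole batch.
          document.body.appendChild(batch);

          const visited: string[] = [];
          for (const a of Array.from(batch.getElementsByTagName("a"))) {
            // :visited links pick up color: #ff0000 from the stylesheet above.
            if (getComputedStyle(a).color === "rgb(255, 0, 0)") {
              visited.push(a.href);
            }
          }

          document.body.removeChild(batch);
          return visited;
        }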

    Read the article

  • when was Kase born?

    - by Horace Ho
    The first time I saw a class named Kase, I was scratching my head. My guess is that it has something to do with avoiding a conflict with the keyword case. BTW, since when, and for which language(s), did this become a norm?

    Read the article

  • What ever happened to APL?

    - by lkessler
    When I was at University 30 years ago, I used a programming language called APL. I believe the acronym stood for "A Programming Language". The language was interpreted and was especially useful for array and matrix operations, with powerful operators and library functions to help with that. Did you use APL? Is this language still in use anywhere? Is it still available, either commercially or open source? I remember the combinatorics assignment we had. It was complex. It took a week of work for people to program it in PL/1, and those programs ranged from 500 to 1000 lines long. I wrote it in APL in under an hour. I left it at 10 lines for readability, although I should have been a purist and worked another hour to get it into 1 line. The PL/1 programs took 1 or 2 minutes to run on the IBM mainframe and solve the problem. The computer charge was $20. My APL program took 2 hours to run and the charge was $1,500, which was paid for by our Computer Science Department's budget. That's when I realized that a week of my time is worth way more than saving some $'s in someone else's budget. I got an A+ in the course. P.S. Don't miss the presentation entitled "APL: one of the greatest programming languages ever".

    Read the article

  • Who are the most important people in open-source software? [closed]

    - by poseid
    I am reading a book by Malcolm Gladwell on the circumstances of successful careers. The book argues that Bill Gates, Steve Jobs, Bill Joy and other successful computer pioneers were born between 1950 and 1955, and had completed around 10,000 hours of practice before microcomputers became widely available in the 1970s and their fairy-tale success stories began. As we are in the age of Web 2.0, with new forms of databases and pervasive access to information, who are, in your opinion, the most successful computer programmers or scientists of our times, when were they born, and to which technologies did they have access?

    Read the article

  • Are there old versions of Windows UX guidelines somewhere?

    - by Camilo Martin
    Since I read the Windows User Experience Interaction Guidelines (there's a PDF download available), I've found it to be admirably self-deprecating, humbly pointing out their own horrible UI practices long scolded by Joel Spolsky. I'd like to know, however, what they had in mind while they made those mistakes. Is this (terrific) UX Guidelines document something new, or were there previous editions of it? If so, where can I find them? My prayers to Google yielded no leniency.

    Read the article

  • WordPerfect programmers refusing to use anything but assembler

    - by Totophil
    There is a version of events (popularised by Joel Spolsky) attributing the demise of WordPerfect to its programmers' refusal to use anything but assembler, which led to the delay of the first WPwin release and, as a result, eventually to losing the all-important battle with Microsoft. There are a few references to programming work being done in assembler in the autobiographical book "Almost Perfect" by W. E. Pete Peterson, who used to have a major influence on how the corporation was run. But these references go back to the early 80s, when WordPerfect was trying to gain a significant market share by defeating WordStar, and not to the early nineties, when the battle with MS took place. I am looking for a second, independent source to confirm the assumption. Maybe someone who worked for WordPerfect Corporation at the time, who was close to the company, or who had a chance to see the source code could clarify the issue. Your help is much appreciated, thanks! Please note that this question is not about any other theories or reasons behind WordPerfect's demise. I really just need to clarify whether they used assembler as the primary language for WPwin and (as a bonus, really) whether there were discussions held within the corporation about assembler being the right choice. Concisely: Did WPCorp use assembler as the primary language for WPwin? Were discussions held at the time among WPCorp staff about assembler being the right choice (was it a management decision or a programmers' decision)?

    Read the article

  • What are programming lost arts?

    - by pavpanchekha
    Have you ever programmed raw machine code (not for class)? Examined a hex dump with just a hex editor (or, heck, without)? Written your own software floating-point library? Division library? Written a non-school-assignment in Lisp or Forth? What sort of "lost arts" have been forgotten? And what reason (if any) would there be to resurrect them?

    Read the article

  • Significant new inventions in computing since 1980

    - by Alan Kay
    This question arose from comments about different kinds of progress in computing over the last 50 years or so. I was asked by some of the other participants to raise it as a question to the whole forum. Basic idea here is not to bash the current state of things but to try to understand something about the progress of coming up with fundamental new ideas and principles. I claim that we need really new ideas in most areas of computing, and I would like to know of any important and powerful ones that have been done recently. If we can't really find them, then we should ask "Why?" and "What should we be doing?"

    Read the article

  • Who in the software world do you admire the most?

    - by David McGraw
    In an effort to spark some discussion and to find interesting people that I didn't know about, is there anybody around the software industry that you really admire? Perhaps admire is the wrong choice of word, but I'm sure there is somebody out there that has impacted you in a minor way. What did you learn from this individual that defines what you try to achieve today?

    Read the article

  • Why do C compilers prepend underscores to external names?

    - by Michael Burr
    I've been working in C for so long that the fact that compilers typically add an underscore to the start of an extern is just understood... However, another SO question today got me wondering about the real reason why the underscore is added. A Wikipedia article claims that a reason is: "It was common practice for C compilers to prepend a leading underscore to all external scope program identifiers to avert clashes with contributions from runtime language support." I think there's at least a kernel of truth to this, but it also seems to not really answer the question, since if the underscore is added to all externs it won't help much with preventing clashes. Does anyone have good information on the rationale for the leading underscore? Is the added underscore part of the reason that the Unix creat() system call doesn't end with an 'e'? I've heard that early linkers on some platforms had a limit of 6 characters for names. If that's the case, then prepending an underscore to external names would seem to be a downright crazy idea (now I only have 5 characters to play with...).

    Read the article

  • Why has anybody ever used COBOL?

    - by sarzl
    I know: you and I hate COBOL. I took a look at a lot of code examples, and it didn't take me long to see why everybody tries to avoid it. So I really have no idea: Why was COBOL ever used? I mean: hey, there was Fortran before it, and Fortran looks like a jesus-language compared to COBOL. This isn't meant to be argumentative but historical, as I'm young and didn't even know about COBOL until 4 months ago.

    Read the article

  • Why is the software world full of status codes?

    - by David V McKay
    Why did programmers ever start using status codes? I mean, I guess I could imagine this might be useful back in the days when a text string was an expensive resource. WAYYY back then. But even after we had megabytes of memory to work with, we continued to use them. What possible advantage could there be in obfuscating the meaning of an error message or status message behind a status code?

    Read the article

  • Cobol: science and fiction

    - by user847
    There are a few threads about the relevance of the Cobol programming language on this forum, e.g. this thread links to a collection of them. What I am interested in here is a frequently repeated claim based on a study by Gartner from 1997: that there were around 200 billion lines of code in active use at that time! I would like to ask some questions to verify or falsify a couple of related points. My goal is to understand if this statement has any truth to it or if it is totally unrealistic. I apologize in advance for being a little verbose in presenting my line of thought and my own opinion on the things I am not sure about, but I think it might help to put things in context and thus highlight any wrong assumptions and conclusions I have made. Sometimes, the "200 billion lines" number is accompanied by the added claim that this corresponded to 80% of all programming code in any language in active use. Other times, the 80% merely refers to so-called "business code" (or some other vague phrase hinting that the reader is not to count mainstream software, embedded systems or anything else where Cobol is practically non-existent). In the following I assume that the code does not include double-counting of multiple installations of the same software (since that is cheating!). In particular, in the time prior to the Y2K problem, it was noted that a lot of Cobol code was already 20 to 30 years old. That would mean it was written in the late 60s and 70s. At that time, the market leader was IBM with the IBM/370 mainframe. IBM has put up a historical announcement on its website quoting prices and availability. According to the sheet, prices were about one million dollars for machines with up to half a megabyte of memory.
    Question 1: How many mainframes were actually sold? I have not found any numbers for those times; the latest numbers are for the year 2000, again by Gartner. :^( I would guess that the actual number is in the hundreds or the low thousands; if the market size was 50 billion in 2000 and the market has grown exponentially like any other technology, it might have been merely a few billion back in 1970. Since the IBM/370 was sold for twenty years, twenty times a few thousand results in a few tens of thousands of machines (and that is pretty optimistic)!
    Question 2: How large were the programs in lines of code? I don't know how many bytes of machine code result from one line of source code on that architecture. But since the IBM/370 was a 32-bit machine, any address access must have used 4 bytes plus the instruction (2, maybe 3 bytes for that?). If you count in the operating system and the data for the program, how many lines of code would have fit into the main memory of half a megabyte?
    Question 3: Was there no standard software? Did every single machine sold run a unique hand-coded system without any standard software? Seriously, even if every machine was programmed from scratch without any reuse of legacy code (wait ... didn't that violate one of the claims we started from to begin with???), we might have O(50,000 l.o.c./machine) * O(20,000 machines) = O(1,000,000,000 l.o.c.). That is still far, far, far away from 200 billion! Am I missing something obvious here?
    Question 4: How many programmers would we have needed to write 200 billion lines of code? I am really not sure about this one, but if we take an average of 10 l.o.c. per day, we would need 55 million man-years to achieve this! In a time-frame of 20 to 30 years, this would mean that there must have been two to three million programmers constantly writing, testing, debugging and documenting code. That would be about as many programmers as we have in China today, wouldn't it?
    Question 5: What about the competition? So far, I have come up with two things here: 1) IBM had its own programming language, PL/I. Above I have assumed that the majority of code was written exclusively in Cobol. However, all other things being equal, I wonder if IBM marketing would really have pushed its own development off the market in favor of Cobol on its machines. Was there really no relevant code base of PL/I? 2) Sometimes (also on this board, in the thread quoted above) I come across the claim that the "200 billion lines of code" are simply invisible to anybody outside of "governments, banks ..." (and whatnot). Actually, the DoD had funded their own language in order to increase cost effectiveness and reduce the proliferation of programming languages. This led to their use of Ada. Would they really worry about having so many different programming languages if they had predominantly used Cobol? If there was any language running on "government and military" systems outside the perception of mainstream computing, wouldn't that language be Ada?
    I hope someone can point out any flaws in my assumptions and/or conclusions and shed some light on whether the above claim has any truth to it or not.
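    For readers following the arithmetic, here is a small TypeScript sketch of the back-of-envelope estimates from questions 3 and 4 above. Every constant is the question's own assumption (or mine, in the case of days per year), not an authoritative figure.

        // Question 3: rough upper bound on total code from machine count and program size.
        const locPerMachine = 50_000;            // assumed l.o.c. per machine
        const machinesSold = 20_000;             // assumed machines sold
        const estimatedLoc = locPerMachine * machinesSold;
        console.log(estimatedLoc);               // 1,000,000,000 -- far from 200 billion

        // Question 4: man-years needed to write the claimed 200 billion lines.
        const claimedLoc = 200_000_000_000;      // the Gartner figure under discussion
        const locPerDay = 10;                    // assumed average productivity
        const daysPerYear = 365;                 // assumption: calendar years
        const manYears = claimedLoc / locPerDay / daysPerYear;
        console.log(Math.round(manYears / 1e6)); // ~55 (million man-years)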

    Read the article

  • (For what) Are Fortran, Cobol and Co. used today?

    - by lamas
    I'm a relatively young programmer, so I don't really know much about languages like Fortran or Cobol that have their origins in the beginnings of modern computing. I'm a bit confused, because it seems like there are many people out there saying that these two languages are still very much alive and being used all over the world, whereas others say the opposite. In addition, it seems like there are only very few questions tagged Fortran or Cobol here on Stack Overflow. Can someone "demystify" the situation for me? Who uses these venerable languages these days, and are they even used anymore? Do you have any experience with one of these languages, or do you know something about their latest developments?

    Read the article

  • is there a way to switch bash or zsh from emacs mode to vi mode with a keystroke

    - by Brandon
    I'd like to be able to switch temporarily from emacs mode to vi mode, since vi mode is sometimes better, but I'm usually halfway through typing something before I realize I want it. I don't want to switch permanently to vi mode, because I normally prefer emacs mode on the command line, mostly because it's what I'm used to, and over the years many of the keystrokes have become second nature. (As an editor I generally use emacs in viper mode, so that I can use both vi and emacs keystrokes, since I found myself accidentally using emacs keystrokes in vi all the time and screwing things up, and because in some cases I find vi keystrokes more memorable and handy, and in other cases emacs ones.)

    Read the article

  • Why is RAISERROR misspelled? Or is it not?

    - by Jason
    Why isn't RAISERROR spelled RAISEERROR? Where did the second E go? I could understand if it were some ancient keyword-length constraint, but I wouldn't expect it to be a nine-character limit. Is RAIS or RROR a technical word, such that "raise-error" is just a misreading? Are its (immediate) origins in a different language? I've searched Google but haven't found much on the subject.

    Read the article

  • What is the purpose of Java's unary plus operator?

    - by Syntactic
    Java's unary plus operator appears to have come over from C, via C++. As near as I can tell, it has the following effects: it promotes its operand to int, if it's not already an int or wider; it unboxes its operand, if it's a wrapper object; and it complicates slightly the parsing of evil expressions containing large numbers of consecutive plus signs. It seems to me that there are better (or, at least, clearer) ways to do all of these things. In this SO question, concerning the counterpart operator in C#, someone said that "It's there to be overloaded if you feel the need." But in Java, one cannot overload any operator. So does this operator exist in Java just because it existed in C++?

    Read the article
