Search Results

Search found 21838 results on 874 pages for 'long double'.

Page 508/874

  • Why are "Inverted Colors" considered an accessibility feature? [migrated]

    - by RLH
    Why is it that in Apple software (OS X and iOS), the "Inverted Colors" display feature is considered an accessibility option? I understand that some users are color-blind. That would justify the black-and-white or grayscale modes. What I don't understand is how or why inverting the display colors helps someone with a specific visual impairment or dysfunction. As a programmer who wants to understand the need so that I can develop better, accessible software, what purpose does this feature serve for an end user with some form of visual impairment? NOTE: I felt that this was a hard question to categorize on StackExchange. I settled on Programmers because I assume that questions of accessibility are important to all developers, and this question sits somewhere in the middle between topics that StackOverflow and SuperUser may cover. Also, this question isn't specific to Apple software. I've just noticed that this feature has been available on Macs for a very long time, it's a feature on iOS, and it's always associated with the Accessibility settings. If I can garner some information regarding the needs of these users, I think that I can develop better, accessible software.

    Read the article

  • Should I not show all my skills?

    - by Cracker
    I have been programming for a very long time and I have in-depth knowledge of several technologies. Recently I applied for a web development job, and in my resume I listed all my skills: HTML, CSS, JavaScript, jQuery, AJAX, PHP, ASP, JSP, C/C++, ARM. Except for C/C++ and ARM, I had shown my skill level for all the technologies as expert. Many of my friends applied for the same job, and they did not have any web development experience. ALL of them got a call for an interview. However, I got a rejection saying "we have received applications from very high-level candidates and you have not been selected to go to the next level". This has seriously demotivated me. I do not understand why I was rejected when I had all the required skills, while all those who did not have any of the skills were selected. One reason I can think of is that the employer doubts that one person can be an expert in all these technologies. Once, in another interview, the HR manager told me it was unbelievable that I knew ASP, JSP, and PHP all in depth, as they had different programmers for each technology. Such incidents make me very unhappy, as in spite of being highly qualified for the position I am rejected. Should I leave some skills off my resume to avoid such situations?

    Read the article

  • How to search a text file for strings between two tokens in Ubuntu terminal and save the output?

    - by Blue
    How can I search a text file for this pattern in the Ubuntu terminal and save the output as a text file? I'm looking for everything between the string "abc" and the string "cde" in a long list of data. For example:

        blah blah abc fkdljgn cde blah
        blah blah blah blah blah
        abc skdjfn cde blah

    In the example above I would be looking for an output such as this:

        fkdljgn
        skdjfn

    It is important that I can also save the data output as a text file. Can I use grep or agrep, and if so, what is the format?
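
    A sketch of one way to do this, assuming GNU grep with PCRE support and that both tokens always appear on the same line (input.txt and output.txt are placeholder file names):

        # -o prints only the matched part; \K drops "abc " from the match,
        # and the lookahead stops the match just before " cde"
        grep -oP 'abc \K.*?(?= cde)' input.txt > output.txt

        # an alternative without PCRE, using sed's capture group
        sed -n 's/.*abc \(.*\) cde.*/\1/p' input.txt > output.txt

    The redirection (> output.txt) is what saves the result as a text file; agrep isn't needed for a fixed-string pattern like this.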

    Read the article

  • Ubuntu doesn't start and I can't login [migrated]

    - by Meph00
    My Ubuntu 13.04 doesn't boot anymore: eternal black screen. If I press ALT+CTRL+F1 I see that it's stuck on "Checking battery state [OK]". I'd like to try sudo apt-get install gdm, but I can't log in on terminal tty2, tty3, etc. They correctly ask for my nickname, then they make me wait a lot, ask for the password, and make me wait again. After a lot of time (... a lot) the best I could achieve was seeing "Documentation: https://help.ubuntu.com". I can never reach the point where I can enter commands. Plus, during the long pauses, every 2 minutes it gives a message like this: INFO: task XXX blocked for more than 120 seconds. Any suggestion? Sorry for my bad English, and thanks everyone for the attention.

    Read the article

  • Upgrading to 12.10 on an external hard drive

    - by Tom Childers
    I did some googling on this and didn't find anything specific to my situation. I currently have 12.04 installed on an external USB hard drive, and it's working great. I want to upgrade it to 12.10. My bandwidth is very limited, so I have a friend who will download 12.10 for me and put it on a flash stick; then I can upgrade without having to do the download myself. Which particular version of the 12.10 download file(s) should I get? Are there alternate 12.10 downloads that have all the packages? How do I set it up so that when I upgrade 12.04 I can tell it to look in some local repository for the 12.10 files? Can I just dump the 12.10 files in some local directory, or do I have to go through some complex commands to create a local repository? I'm pretty new to Linux, so a long process of complex terminal commands will probably be a show-stopper for me. Remember that my 12.04 install resides on an external hard drive, and I have a laptop with multiple USB ports. Thanks! Advait

    Read the article

  • Why isn't VIM storing macros across sessions?

    - by dotancohen
    In VIM 7.3 on Ubuntu Server 12.04.1, VIM forgets macros and registers after closing. I do have set nocompatible in .vimrc and the command :set viminfo? gives this result: viminfo='100,<50,s10,h What might be preventing the macros and registers from being stored across close / open? Note that I am not interested in storing mappings for long term use in .vimrc. Rather, sometimes (such as during refactoring) I need to perform a simple operation on a few files and I find it easier to do in VIM than with Perl. I just need the macros and registers stored across open / close, which I do have working on other servers. Thanks.
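
    One common culprit, offered as a guess: a ~/.viminfo owned by root (typically left behind by an earlier sudo vim session), which silently stops Vim from writing registers back on exit. A quick check, assuming the default viminfo location ("youruser" is a placeholder):

        # the viminfo file must be writable by the user running vim
        ls -l ~/.viminfo

        # if root owns it, reclaim it
        sudo chown youruser: ~/.viminfo

    If the file is writable and the problem persists, comparing :set viminfo? on this server against one of the servers where persistence works would be the next step.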

    Read the article

  • Xorg crashes ever since using Nvidia dual monitors

    - by legion
    Well, ever since I have used dual monitors with the NVIDIA X Server Settings program, my Xorg process has been crashing after a while, and it's generally a pretty long while, like 6 hours. Before NVIDIA changed my xorg.conf file I only had Xorg crash maybe twice in 2 months; I can't figure out what is going on. I am running Ubuntu 12.04 with the MATE desktop environment v1.2.0. xorg.conf:

        # nvidia-settings: X configuration file generated by nvidia-settings
        # nvidia-settings: version 295.33 (buildd@allspice) Fri Mar 30 15:25:24 UTC 2012

        Section "ServerLayout"
            Identifier     "Layout0"
            Screen      0  "Screen0" 0 0
            InputDevice    "Keyboard0" "CoreKeyboard"
            InputDevice    "Mouse0" "CorePointer"
            Option         "Xinerama" "0"
        EndSection

        Section "Files"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier     "Mouse0"
            Driver         "mouse"
            Option         "Protocol" "auto"
            Option         "Device" "/dev/psaux"
            Option         "Emulate3Buttons" "no"
            Option         "ZAxisMapping" "4 5"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier     "Keyboard0"
            Driver         "kbd"
        EndSection

        Section "Monitor"
            # HorizSync source: edid, VertRefresh source: edid
            Identifier     "Monitor0"
            VendorName     "Unknown"
            ModelName      "LEN"
            HorizSync      51.8 - 55.8
            VertRefresh    40.0 - 60.0
            Option         "DPMS"
        EndSection

        Section "Device"
            Identifier     "Device0"
            Driver         "nvidia"
            VendorName     "NVIDIA Corporation"
            BoardName      "NVS 3100M"
        EndSection

        Section "Screen"
            # Removed Option "TwinView" "0"
            # Removed Option "metamodes" "DFP-0: nvidia-auto-select +0+0"
            # Removed Option "TwinView" "1"
            # Removed Option "metamodes" "DFP-1: nvidia-auto-select +0+0, CRT: nvidia-auto-select +1920+0"
            Identifier     "Screen0"
            Device         "Device0"
            Monitor        "Monitor0"
            DefaultDepth   24
            Option         "TwinViewXineramaInfoOrder" "DFP-1"
            Option         "TwinView" "0"
            Option         "metamodes" "DFP-0: nvidia-auto-select +0+0"
            SubSection     "Display"
                Depth      24
            EndSubSection
        EndSection

    Read the article

  • BSOD after PC has been running for a while

    - by user1389999
    I'm having a problem where my computer is getting a blue screen. I have noticed that the BSOD happens after the PC has been running for about 2 days. The BSODs seem to be an error with atikmpag.sys, with an error code of 0x00000116. The problem started about a month ago. It has happened all five times that I left my computer on that long in the past month. Because my computer is a pre-built one from Dell, and I had upgraded the graphics card (at least a year ago) to a more demanding one, I replaced the stock 360W power supply that came with it with a more powerful 680W one, because it seemed like the problem could be related to a lack of power supply wattage, but it didn't affect the problem at all. Here are the minidump files for the five BSODs that I have experienced: https://dl.dropbox.com/u/3488338/bsoddumps.zip System info: Windows 7 Home Premium x64, Dell Studio XPS 435MT, Radeon HD 5670 (version 12.4 of the Catalyst driver), Intel Core i7 920 2.67 GHz, 6GB of RAM

    Read the article

  • Do you write unit tests all the time in TDD?

    - by mcaaltuntas
    I have been designing and developing code TDD-style for a long time. What disturbs me about TDD is writing tests for code that does not contain any business logic or interesting behaviour. I know TDD is a design activity more than testing, but sometimes I feel it's useless to write tests in these scenarios. For example, I have a simple scenario like "when the user clicks the check button, it should check the file's validity". For this scenario I usually start by writing a test for the presenter/controller class like the one below.

        @Test
        public void when_user_clicks_check_it_should_check_selected_file_validity() {
            MediaService service = mock(MediaService.class);
            View view = mock(View.class);
            when(view.getSelectedFile()).thenReturn("c:\\Dir\\file.avi");
            MediaController controller = new MediaController(service, view);
            controller.check();
            verify(service).check("c:\\Dir\\file.avi");
        }

    As you can see, there is no design decision or interesting code to verify behaviour; I am only testing that the value from the view is passed to MediaService. I usually write these kinds of tests but don't like them. What do you do in these situations? Do you write tests every time? UPDATE: I have changed the test name and code after complaints. Some users said that you should write tests for trivial cases like this so that in the future someone might add interesting behaviour. But what about "code for today, design for tomorrow"? If someone, including myself, adds more interesting code in the future, the test can be created for it then. Why should I do it now for the trivial cases?

    Read the article

  • SSH only works after intentionally failed password

    - by pyraz
    So, I'm having a rather weird problem. I have a server, that when I try to SSH into, immediately closes the connection if I type in the correct password on the first attempt. However, if I purposefully enter a wrong password on the first attempt, and then enter a correct password at the second or third prompt, it successfully logs me into the computer. Similarly, when I try to use public key authentication, I get an immediate closed connection. If, however, I enter a wrong password for my key file, followed by another wrong password once it reverts to password authentication, I can successfully log in as long as I provide the correct password at the second or third prompt. The machine is running Red Hat Enterprise Linux Server release 6.2 (Santiago), and is using LDAP and PAM for authentication. Any ideas on where to start debugging this one? Let me know what config files I need to provide and I'll be happy to do so.
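
    One way to start might be to run a second sshd in debug mode on a spare port and compare a first-attempt login against a second-attempt one; the PAM/LDAP conversation is usually where the difference shows up. A sketch, assuming port 2222 is free and you have root:

        # server side: a one-off sshd that logs the whole auth exchange
        sudo /usr/sbin/sshd -d -p 2222

        # client side: maximum verbosity against the debug instance
        ssh -vvv -p 2222 user@server

        # meanwhile, watch what sshd and PAM log on RHEL
        sudo tail -f /var/log/secure

    Diffing the debug output of a first-try success (connection closed) against a second-try success should point at which PAM module or LDAP lookup behaves differently on the first attempt.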

    Read the article

  • c# vocabulary

    - by foxjazz
    I have probably seen and used the word "encapsulation" 4 times in my 20 years of programming. I now know what it is again, after an interview for a C# job, even though I have used the public, private, and protected keywords in classes for as long as C# has existed. I can still remember coming across the string.IndexOf function and thinking, why didn't they call it IndexAt. Now with all the new items like lambdas and Rx, LINQ, map and pmap, etc., etc., I think the more choices there are to do 1 or 2 things 10 or 15 differing ways, the more programmers tend to stay with what works and leverage the new stuff only when it really becomes beneficial. For many, the new stuff is harder to read, because programmers aren't used to seeing declarative notation. I mean, I have probably used yield break twice in my project where it may have been possible to use it many more times. Or the using statement (not the declaration of namespace references, but inline using); I never really saw a big advantage to this, other than confusion. It is another form of local encapsulation (oh, there's 5 uses in my programming career), but who's counting? THE COMPUTERS ARE COUNTING! In business logic, most programming is about displaying lists, selecting items in a list, and sending those choices to some other system or database to keep track of those selections. What makes this difficult is how these items relate to one another and to externally listed items. Well, I probably need to go back to school and get a C# certification so I can say I am an expert in C#. Apparently using all aspects of C# (even unsafe code) in my programming life doesn't make me certified, just certifiable. This is a good time to sign off: Fox-jazzy

    Read the article

  • MySQL slow query log logging all queries

    - by Blanka
    We have a MySQL 5.1.52 Percona Server 11.6 instance that suddenly started logging every single query to the slow query log. The long_query_time configuration is set to 1, yet suddenly we're seeing every single query (e.g. we just saw one that took 0.000563s!). As a result, our log files are growing at an insane pace; we just had to truncate a 180G slow query log file. I tried setting the long_query_time variable to a really large number (1000000) to see if the logging stopped altogether, but same result.

        show global variables like 'general_log%';
        +------------------+--------------------------+
        | Variable_name    | Value                    |
        +------------------+--------------------------+
        | general_log      | OFF                      |
        | general_log_file | /usr2/mysql/data/db4.log |
        +------------------+--------------------------+
        2 rows in set (0.00 sec)

        show global variables like 'slow_query_log%';
        +---------------------------------------+-------------------------------+
        | Variable_name                         | Value                         |
        +---------------------------------------+-------------------------------+
        | slow_query_log                        | ON                            |
        | slow_query_log_file                   | /usr2/mysql/data/db4-slow.log |
        | slow_query_log_microseconds_timestamp | OFF                           |
        +---------------------------------------+-------------------------------+
        3 rows in set (0.00 sec)

        show global variables like 'long%';
        +-----------------+----------+
        | Variable_name   | Value    |
        +-----------------+----------+
        | long_query_time | 1.000000 |
        +-----------------+----------+
        1 row in set (0.00 sec)
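
    One thing worth ruling out, offered as a guess: settings that log queries regardless of long_query_time. Stock MySQL has log_queries_not_using_indexes, and Percona Server adds its own slow-log controls; the variable names below are the documented ones, but they are worth confirming against this particular build:

        # queries using no index get logged no matter how fast they run
        mysql -e "SHOW GLOBAL VARIABLES LIKE 'log_queries_not_using_indexes';"

        # Percona's extended slow-log knobs (filter, rate limit, verbosity)
        mysql -e "SHOW GLOBAL VARIABLES LIKE 'log_slow%';"

    If one of these is enabled, it would explain sub-millisecond queries appearing in the log while long_query_time sits at 1 second.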

    Read the article

  • Mail sent from local Postfix marked as "possible phishing" in Outlook

    - by leo grrr
    Hi folks, Sorry for the newbie question--this is not my area of expertise by a long shot. I work at a small development shop and we finally got around to doing code reviews. (Yay!) I set up an instance of Review Board -- an open-source code review tool -- on one of our local servers but it doesn't seem to like talking to our hosted Exchange server to send notification emails. I decided to just install Postfix on that same box and send mail from localhost, which is working much more reliably, but Outlook disables all links in the email announcements and marks it as possible phishing. What is making these emails look suspicious and what can I change? Would the best thing be to figure out how to relay to Exchange from Postfix? Thanks!
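
    A guess at the cause: mail sent straight from localhost leaves from a box whose hostname, reverse DNS, and SPF records don't match the sender domain, which is exactly what phishing heuristics flag. Relaying through the hosted Exchange server usually fixes that. A minimal sketch for /etc/postfix/main.cf, with a placeholder hostname and credentials file:

        # relay all outbound mail through the hosted Exchange server
        relayhost = [exchange.example.com]:587
        smtp_use_tls = yes
        smtp_sasl_auth_enable = yes
        smtp_sasl_password_maps = hash:/etc/postfix/sasl_passwd
        smtp_sasl_security_options = noanonymous

    After editing, run postmap /etc/postfix/sasl_passwd and postfix reload. Whether Exchange accepts authenticated submission on port 587 depends on how the hosted service is configured, so the port and auth settings above are assumptions to verify with the provider.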

    Read the article

  • Selection Issues with a PDF from a Word document

    - by syrion
    I have a long Word document that has a running footer. When I try to copy and paste across pages in the PDF generated from this document, the behavior of this footer is unpredictable--sometimes it is unselected, sometimes it is selected, sometimes the footer on the next page is selected. I would prefer to make this portion of the document unselectable, so that it still shows up but doesn't interfere with copying and pasting. Does anyone have an idea of how to do this? No, changing it to an image isn't possible, because it includes a page number.

    Read the article

  • Are there any good examples of open source C# projects with a large number of refactorings?

    - by Arjen Kruithof
    I'm doing research into software evolution and C#/.NET, specifically on identifying refactorings from changesets, so I'm looking for a suitable (XP-like) project that may serve as a test subject for extracting refactorings from version control history. Which open source C# projects have undergone large (number of) refactorings? Criteria A suitable project has its change history publicly available, has compilable code at most commits and at least several refactorings applied in the past. It does not have to be well-known, and the code quality or number of bugs is irrelevant. Preferably the code is in a Git or SVN repository. The result of this research will be a tool that automatically creates informative, concise comments for a changeset. This should improve on the common development practice of just not leaving any comments at all. EDIT: As Peter argues, ideally all commit comments would be teleological (goal-oriented). Practically, if a comment is made at all it is often descriptive, merely a summary of the changes. Sadly we're a long way from automatically inferring developer intentions!

    Read the article

  • Accidentally deleted the software for MyPassport Essential SE 1TB Hard Drive

    - by user26192
    I'm posting for a friend of mine. She bought a WD MyPassport Essential SE 1TB hard drive the other day. When she plugs it into her laptop's USB port, the drive cannot be recognized by the SmartWare software. While she was doing a backup of her files, McAfee was running in the background. Since the backup was taking so long to finish, she decided to pause it. She tried to delete the partially backed-up files, but instead she accidentally deleted the entire contents of the folder, including the pre-installed software. Now, when she tries to start up the MyPassport, the SmartWare doesn't show up anymore. Can someone please give us advice on what she can do about this? Thank you.

    Read the article

  • Pipe an infinite stream to internal loop?

    - by Sh3ljohn
    I've seen a lot of things about redirecting stdout to a TCP socket, but no real example of how to do it in practice, specifically when the output stream generated by the first "command" never ends. To talk about something concrete, let's take programs like servers that typically write their log endlessly to stdout (well, as long as they run). If you redirect the output to a log file on the disk, then this file is always open (therefore not readable by others?) and grows infinitely, which eventually is going to cause problems. This might be a noob question, but I don't know how to do it, so:

    1. How do I redirect the output of a command to the internal loop?
    2. I want to make sure that data is sent EVERY time something is written to stdout, and that the pipe won't wait for the command to end (which ideally never happens). Is that right?
    3. If 2 is true, is there a buffer system to send chunks of data only once they reach a certain size?
    4. Could you give me concrete command line examples to do the above?
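
    For question 4, a sketch using netcat; the host, port, and file names are placeholders:

        # follow a growing log and forward every new line to a TCP listener
        tail -F /var/log/myapp.log | nc 192.0.2.10 9000

        # or pipe the command's stdout directly, forcing line buffering so
        # each line is sent as it is written rather than when the (typically
        # 64K) pipe buffer fills -- which also answers question 3: the pipe
        # itself batches data in chunks unless you change the buffering mode
        stdbuf -oL ./myserver | nc 192.0.2.10 9000

    Neither pipeline waits for the command to finish; data flows as it is produced, subject to the buffering noted in the comments.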

    Read the article

  • How do you proactively guard against errors of omission?

    - by Gabriel
    I'll preface this with: I don't know if anyone else who's been programming as long as I have actually has this problem, but at the very least, the answer might help someone with less XP. I just stared at this code for 5 minutes, thinking I was losing my mind because it didn't work:

        var usedNames = new HashSet<string>();
        Func<string, string> l = (s) =>
        {
            for (int i = 0; ; i++)
            {
                var next = (s + i).TrimEnd('0');
                if (!usedNames.Contains(next))
                {
                    return next;
                }
            }
        };

    Finally I noticed I forgot to add the used name to the hash set. Similarly, I've spent minutes upon minutes over omitting context.SaveChanges(). I think I get so distracted by the details that I'm thinking about that some really small details become invisible to me; it's almost at the level of a mental block. Are there tactics to prevent this? Update: a side effect of asking this was fixing the error it would have for i > 9 (thanks!):

        var usedNames = new HashSet<string>();
        Func<string, string> name = (s) =>
        {
            string result = s;
            if (usedNames.Contains(s))
                for (int i = 1; ; result = s + i++)
                    if (!usedNames.Contains(result))
                        break;
            usedNames.Add(result);
            return result;
        };

    Read the article

  • What would cause a query run from SSMS on the local box to run slower than from a remote box

    - by Racter
    When I run a simple query such as "Select Column1, Column2 from Table A" from within SSMS running on my production SQL Server, the results seem to take extremely long (45 min). If I run the same query from my dev system's SSMS connecting to the production SQL Server, the results return within a few seconds (<60 sec). One thing I have noticed is that if the system was just rebooted, performance is good for a bit. It is hard to put a time on it, as I have had it start running slow very quickly after a reboot, but at most it performed well for 20 min before acting up again. Also, just restarting the SQL service does not resolve the issue or provide a temporary performance boost. Specs for the server are: Windows Server 2003 Enterprise Edition SP2, 4 x Intel Xeon 3.6GHz, 6GB system memory, active/active cluster, SQL Server 2005 SP2 (9.0.3239).

    Read the article

  • Back in Town and Ready for New Beginnings

    - by MOSSLover
    Originally posted on: http://geekswithblogs.net/MOSSLover/archive/2013/11/03/back-in-town-and-ready-for-new-beginnings.aspx

    I just took a super long trip that lasted from September 27th until today. I flew into St. Louis and then rented a car and drove over 12,000 miles; I just dropped the rental off last night. I went to a ton of states, did a lot of really cool things, saw a lot of really cool people, and bought a ton of beer. I made some decisions, but this post isn't really about my decisions. It's more about the question that everyone has been asking: "Where am I going to work?". So here's the answer... BlueMetal Architects, as a Senior SharePoint Engineer. Here is their website: http://www.bluemetal.com/. I basically start tomorrow. I didn't want to post anything super early, because I didn't want to jinx things. I am really excited. Now that I'm back, I'm hoping that things will start to turn around for me. I look forward to the future.

    Read the article

  • Using a CDN for CMS software (multiple sites)

    - by SmokeyPHP
    I'm currently researching ideas for the media management side of a CMS I'm writing. I was looking at having images served from a CDN, which is fine on a single site, but I want all sites that run the CMS to make use of a CDN (which will most likely be a custom-developed one, rather than a third-party service like S3). My main question is: is a multi-site CDN a good idea? I can't think of a downside, but have probably missed something. Obviously the sites won't share the same folder; I envisage the requests looking like css.cdnsite.com/example.com/style.css or something along those lines. Having multiple sites in the same place will obviously make it easier for us to manage, as well as being cheaper, but then I wonder if it'll be worth it... Long story short: how should the CMS handle user-uploaded media across separate installations?

    1. Just keep a local copy of all assets and serve them from the same site, like in days of yore?
    2. Keep a local copy, force sites to use www., and have CDN subdomains per site?
    3. Or use a single separate CDN for all sites?

    Apologies for the length of this question; not sure if this should be multiple questions or not, as all parts are kind of related and could affect each other.

    Read the article

  • HAProxy being killed with more than 54,000 connections

    - by Olly
    I am trying to run HAProxy (1.4.8) on an EC2 machine running Ubuntu 10.04. I need HAProxy to be able to handle many thousands of long-running persistent connections (websockets). With the current setup, HAProxy gets killed at around 54,300 connections. If I run HAProxy in the foreground, the only output is "Killed". Am I right in thinking this is the kernel killing the process? Is this because it is out of resources? Can I increase the resources? CPU and memory consumption are low with 50,000 connections, so I don't suspect either of these. How can I prevent this from happening?
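
    A bare "Killed" usually means a signal from outside the process, most often the kernel's OOM killer; tens of thousands of sockets consume kernel memory for TCP buffers that never shows up as process RSS. A sketch of some checks worth running:

        # the OOM killer announces its victims in the kernel log
        dmesg | grep -i -e oom -e killed

        # file-descriptor ceilings: per-process and system-wide
        ulimit -n
        cat /proc/sys/fs/file-max

        # kernel memory budget for TCP buffers (pages: min pressure max)
        cat /proc/sys/net/ipv4/tcp_mem

    If the OOM killer is confirmed, shrinking the per-socket buffers (net.ipv4.tcp_rmem / tcp_wmem) or moving to an instance with more memory are the usual levers; HAProxy's own maxconn and ulimit-n settings in the global section also need to be high enough for the target connection count.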

    Read the article

  • Create a system image in Windows 8

    - by Greg Low
    One of the things that I've just come to accept is that the designers of Windows 8 and I think very differently. It'll take a long time to convince me that shutting down the computer is a "setting". Even after using Windows 8 for quite a while now, I still find that I struggle nearly every day, just trying to do things that I previously knew how to do. That's just not a good thing. Today I decided to create a system image, as I hadn't made one lately. I started in Control Panel looking for backup options. That yielded nothing except programs that wanted to "Save backup copies of my files with file history". I thought "oh well, let's just try the new search options". I hit the Windows key and typed "Backup". No, nothing came up there either. I searched again all over the Control Panel options, to no avail. So it was time to hit Google again. Once again, clearly lots of people used to know how to do this and have been trying to work out where this option went. The first trick is that there are a bunch of Control Panel options that don't appear in the Control Panel. In the address bar at the top, if you click on Control Panel, you'll find there is an option that says "All Control Panel Options". That is curious, given that's where I thought I was when I opened Control Panel. No hint is given on that screen that there are a bunch of hidden options. Nonetheless, I then checked out "all" the options. The option that you need to create a system image in Windows 8 turns out to be the "Windows 7 File Recovery" option that appears in this extended list. Why does it say "Windows 7" when it's for "Windows 8" as well and I'm running "Windows 8"? Why do I have to choose an option that says "File Recovery" to create a system image backup? <sigh> But at least I've recorded it here for the next time I forget where to find it.

    Read the article

  • How do I install yum on Redhat Enterprise 4?

    - by Bob Cross
    For historical reasons, one of the machines that I manage has a Red Hat Enterprise 4 boot disk (among others). Every now and then, we have to boot into RHEL4 to bring up some of the legacy software that we support and connect to. Since it's a fringe system, the Red Hat support has long since lapsed, and I can't convince myself that it would be worth paying just to get RPMs that I can go and get for myself. That said, the default RHEL tools are heavily biased against letting you do exactly that. I would like to install yum and use it for package discovery and installation. So, is there an installation guide for integrating yum with an older RHEL4 system?
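
    One route, sketched from how this is usually done on CentOS 4 (binary-compatible with RHEL4): pull yum and its Python dependencies from the CentOS vault and install them with plain rpm. The URL, package list, and repo layout below are illustrative assumptions to verify against the vault for your architecture:

        # 1. browse http://vault.centos.org/4.9/os/i386/CentOS/RPMS/ and
        #    download yum plus its deps (python-elementtree, python-sqlite,
        #    python-urlgrabber, sqlite ... the exact list may differ)
        # 2. install them in one transaction so rpm can resolve them together
        rpm -Uvh yum-*.rpm python-elementtree-*.rpm python-sqlite-*.rpm \
                 python-urlgrabber-*.rpm sqlite-*.rpm

        # 3. as root, point yum at the vault instead of the retired channels
        cat > /etc/yum.repos.d/c4-vault.repo <<'EOF'
        [c4-vault-base]
        name=CentOS-4 vault base
        baseurl=http://vault.centos.org/4.9/os/$basearch/
        gpgcheck=0
        EOF

    Mixing CentOS packages onto a genuine RHEL install is a judgment call for a fringe legacy box; the vault also stops receiving updates, so this only gives access to the final CentOS 4 package set.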

    Read the article

  • Performance impact of Zones.

    - by nospam(at)example.com (Joerg Moellenkamp)
    I was really astonished when I saw this question, because it was an old acquaintance from years ago that I hadn't heard for a long time. However, there it was again. The question: "What's the overhead of Zones?". Sun wasn't, and Oracle isn't, saying "zero"; we are saying "minimal". However, during all the performance analysis gigs on customer systems I've done since the introduction of Zones, I have failed to measure any overhead caused by zones. What I saw, however, was additional load introduced by processes that wouldn't be there when you use only one zone: additional monitoring daemons, or additional daemons with a controlling or supervising job for the application, which resulted in slightly longer runtimes of processes, because such additional daemons wanted some cycles on the CPU as well. So when someone wants to tell me that he or she measured a slight slowdown, I ask whether they really measured the impact of the virtualization layer or of a side effect like those described above. It may seem a little hard to believe that a virtualization technology has no overhead; however, keep in mind that there is no hypervisor, just one kernel running that looks and behaves like many operating system instances to apps and users. While this imposes some limits on the technology (because there is just one kernel running, you can't have zones with different kernel versions running ... obvious even to the cursory observer), that is key to its light weight and thus to the low overhead. Continue reading "Performance impact of Zones."

    Read the article
