Search Results

Search found 405 results on 17 pages for 'what the crap'.

Page 1/17 | 1 2 3 4 5 6 7 8 9 10 11 12  | Next Page >

  • T-SQL Tuesday #21 - Crap!

    - by Most Valuable Yak (Rob Volk)
    Adam Machanic's (blog | twitter) ever-popular T-SQL Tuesday series is being held on Wednesday this time, and the topic is… SHIT CRAP. No, not fecal material.  But crap code.  Crap SQL.  Crap ideas that you thought were good at the time, or were forced to do due (doo-doo?) to lack of time. The challenge for me is to look back on my SQL Server career and find something that WASN'T crap.  Well, there's a lot that wasn't, but for some reason I don't remember those that well.  So the additional challenge is to pick one particular turd that I really wish I hadn't squeezed out.  Let's see if this outline fits the bill:
    - An ETL process on text files;
    - That had to interface between SQL Server and an AS/400 system;
    - That didn't use SSIS (should have) or BizTalk (ummm, no) but command-line scripting, using Unix utilities(!) via xp_cmdshell;
    - That had to email reports and financial data, some of it sensitive.
    Yep, the stench smell is coming back to me now, as if it was yesterday… As to why SSIS and BizTalk were not options: basically I didn't know either of them well enough to get the job done (and I still don't).  I also had a strict deadline of 3 days, in addition to all the other responsibilities I had, so no time to learn them.  And seeing how screwed up the rest of the process was:
    - Payment files from multiple vendors in multiple formats;
    - Sent via FTP, PGP-encrypted email, or some other wizardry;
    - Manually opened/downloaded and saved to a particular set of folders (couldn't change this);
    - Once processed, had to be placed BACK in the same folders with the originals archived;
    - x2 divisions that had to run separately;
    - Plus an additional vendor file in another format on a completely different schedule;
    - So that they could be MANUALLY uploaded into the AS/400 system (couldn't change this either, even if it was technically possible).
    I didn't feel so bad about the solution I came up with, which was naturally:
    - Copy the payment files to the local SQL Server drives, using xp_cmdshell
    - Run batch files (via xp_cmdshell) to parse the different formats using sed, a Unix utility (this was before PowerShell)
    - Use other Unix utilities (join, split, grep, wc) to process the parsed files and generate metadata (size, date, checksum, line count)
    - Run sqlcmd to execute a stored procedure, passing the parsed file names so it would bulk load the data to do a comparison
    - bcp the compared data out to ANOTHER text file so that I could grep that data out of the original file
    - Run another stored procedure to import the matched data into SQL Server so it could process the payments, including file metadata
    - Process payment batches and log which division and vendor they belong to
    - Email the payment details to the finance group (since it was too hard for them to run a web report with the same data… which they ran anyway to compare against the emailed file… which always matched, surprisingly)
    - Email another report showing unmatched payments so they could manually void them… about 3 months afterward
    - All in "Excel" format, using xp_sendmail (SQL 2000 system)
    - Copy the unmatched data back to the original folder locations, making sure to match the file format exactly (if you've ever worked with ACH files, you'll understand why this sucked)
    If you're one of the 10 people who have read my blog before, you know that I love the DOS "for" command.  Like passionately.  Like fairy-tale love.  So my batch files were riddled with for loops, nested within other for loops, that called other batch files containing for loops.  I think there was one section that had 4 or 5 nested for commands.  It was wrong, disturbed, and completely un-maintainable by anyone, even myself.  Months, even a year, after I left the company I got calls from someone who had to make a minor change to it, and I had to talk them out of spraying the office with an AK-47 after looking at this code.  (for you Star Trek TOS fans) The funniest part of this, well, one of the funniest, is that I made the deadline… sort of, I was only a day late… and the DAMN THING WORKED practically unchanged for 3 years.  Most of the problems came from the manual parts of the overall process, like forgetting to decrypt the files, or missing/late files, or files saved to the wrong folders.  I'm definitely not trying to toot my own horn here, because this was truly one of the dumbest, crappiest solutions I ever came up with.  Fortunately, as far as I know, it's no longer in use and someone has written a proper replacement.  Today I would knuckle down and do it in SSIS or PowerShell, even if it took me weeks to get it right. The real lesson from this crap code is to make things MAINTAINABLE and UNDERSTANDABLE.  sed scripts full of regular expressions don't fit those criteria in any way.  If you ever find yourself under pressure to do something fast at all costs, DON'T DO IT.  Stop and consider long-term maintainability, not just for yourself but for others on your team.  If you can't explain the basic approach in under 5 minutes, it ultimately won't succeed.  And while you may love to leave all that crap behind, it may follow you anyway, and you'll step in it again.   P.S. - if you're wondering about all the manual stuff that couldn't be changed, it was because the entire process had gone through Six Sigma and was deemed the best possible way.  Phew!  Talk about stink!

    Read the article

  • what does "crap" mean in samba logs

    - by Tim Cronin
    Hi All, I have been googling and googling and can't find a conclusive answer. In Samba log files I see things like the following: "[11560]: pam auth crap domain:" and "NTLM CRAP authentication for user". I'm hoping this stands for something like "Challenge Response Auth Protocol", but when I show the logs to people who aren't technical, I usually get questions or funny looks. Anything anyone knows about this would greatly help. Thanks, Tim

    Read the article

  • Firefox 17 and CSS borders based triangles = crap

    - by Adeher
    Like many front-end devs, I've been using the border trick to render triangles in CSS; this generator helps with the technique: http://apps.eky.hk/css-triangle-generator/. Today, like (almost) every week, the Firefox team released a new version without any clear changelog for the rendering engine. Now we can see an ungraceful gray border around those triangles, and I haven't found a trick to get rid of it yet. On top of that, before Firefox 17, when people were complaining about how aliased those triangles looked, an additional trick was to set the border-style property to "dashed" instead of "solid". Using Firebug on the triangle generator, you can quickly see how it shows up now, and cry yourself to death. I hope a more brilliant front-end dev than me will find a fix for this and share it here. Thanks
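    For readers who haven't seen it, the border trick the question refers to draws a triangle from a zero-size box whose borders meet at angles; a minimal sketch (the class name is just illustrative):

        /* Upward-pointing triangle built purely from borders (illustrative). */
        .triangle-up {
            width: 0;
            height: 0;
            border-left: 50px solid transparent;   /* left slope */
            border-right: 50px solid transparent;  /* right slope */
            border-bottom: 100px solid #555;       /* visible face */
        }

    It is the transparent edges of a shape like this that Firefox 17 reportedly started fringing, per the report above.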

    Read the article

  • Crap, hard disk failure. Can I recover my "move"d folders?

    - by Doug
    I am in the process of moving all my files from an old laptop to a new one. I just moved 11 GB of data from my old laptop to an external hard drive, and upon moving it onward to the new laptop's hard drive, the external drive is giving a CRC error (Data Error, Cyclic Redundancy Check). Now I am looking for a solution to recover the files that I moved off my old laptop (not the external). I understand that they are just marked for potential overwriting to free up space. I was getting ready to test out GetDataBack, but it says to install it on a healthy Windows system and attach the drive that needs recovery as an external. However, I don't want to turn off my computer without first getting the okay, since it is in a "moved" state. Please help! What can I do to recover the moved files? I haven't touched the computer since the move. What can I use to recover them?

    Read the article

  • Playing around with my Java project in Eclipse... what the crap did I just do?

    - by Daddy Warbox
    I don't even remember how, but somehow I managed to make all of my project's source files hidden in Eclipse's Package and Project Explorer panels. Go figure. 'Show Filtered Children (Alt+click)' temporarily reveals the files, and only in the Package Explorer can I double-click to reopen them from this view; they go back into hiding after I select another item, though. Plus, now I'm getting other annoyances, such as all of the collapsed non-hidden items expanding at once when I click an item, and the entire folder tree of my project now being shown in these panels (including my .svn Subversion folders... which shouldn't be any of Eclipse's business, presently). Long story short, my Package/Project Explorers just blew up on me, and I want to know how to fix this. Thanks in advance. P.S. What's a good guide I can use to learn my way around this silly contraption, anyway?

    Read the article

  • How to grant write permissions in Samba?

    - by Eric Fossum
    I'm having trouble with read/write permissions on my Samba server. How do I fix my smb.conf and file permissions to get more consistent access? smb.conf:

        [global]
        workgroup = workgroup
        netbios name = LnxNAS
        server string = %h
        wins support = no
        dns proxy = no
        security = user
        encrypt passwords = yes
        panic action = /usr/share/samba/panic-action %d

        [homes]
        comment = Home Directories

        [Video]
        path = /data/eric/Videos

        [Music]
        path = /data/eric/Music

        [Pictures]
        path = /data/eric/Pictures

        [data]
        path = /data

    And my ls -l of /data/eric/Pictures:

        drwxrwxrwx 2 ericfoss root  4096 2011-03-13 22:09 Android Projs
        drwxrwxrwx 3 ericfoss root  4096 2011-03-13 22:09 Automotive
        -rwxrwxrwx 1 ericfoss root  2439 2010-12-17 17:03 BDD reduction.png
        -rwxrwxrwx 1 ericfoss root  2722 2010-12-17 16:55 BDD Tree.png
        -rwxrwxrwx 1 ericfoss root  7341 2010-12-17 16:46 BDD Tree.xcf
        -rwxrwxrwx 1 ericfoss root 72421 2007-11-22 22:59 Bum Ninja.jpg
        -rwxrwxrwx 1 ericfoss root 32152 2010-12-17 21:25 cell transition.png
        -rwxrwxrwx 1 ericfoss root 40212 2010-12-17 17:55 control graph.png
        drwxrwxrwx 2 ericfoss root  4096 2011-03-13 22:09 Crap
        -rwxrwxrwx 1 ericfoss root    82 2010-09-20 17:18 desktop.ini

    If I try to delete \\Server\Pictures\Crap it says permission denied, but \\Server\data\eric\Pictures\crap can be deleted... I thought security = user took care of this?
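    One setting worth checking, for anyone hitting the same wall: Samba shares are read-only by default (read only = yes), independent of the rwxrwxrwx filesystem permissions, so a share section that only sets path will refuse writes. A minimal sketch of write-enabled share sections, not a drop-in config (paths and user taken from the question):

        [Pictures]
        path = /data/eric/Pictures
        read only = no        ; same effect as "writable = yes"
        valid users = ericfoss

        [data]
        path = /data
        read only = no
        valid users = ericfoss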

    Read the article

  • JAXB adding namespace to parent but not to the child elements contained

    - by Nishant
    I put together an XSD and used JAXB to generate classes out of it. Here are my XSDs.

    myDoc.xsd:

        <?xml version="1.0" encoding="UTF-8"?>
        <xs:schema xmlns="http://www.mydoc.org"
                   targetNamespace="http://www.mydoc.org"
                   xmlns:xs="http://www.w3.org/2001/XMLSchema"
                   xmlns:mtp="http://www.mytypes.com"
                   elementFormDefault="qualified">
          <xs:import namespace="http://www.mytypes.com" schemaLocation="mytypes.xsd" />
          <xs:element name="myDoc">
            <xs:complexType>
              <xs:sequence>
                <xs:element name="crap" type="xs:string"/>
                <xs:element ref="mtp:foo"/>
                <xs:element ref="mtp:bar"/>
              </xs:sequence>
            </xs:complexType>
          </xs:element>
        </xs:schema>

    mytypes.xsd:

        <?xml version="1.0" encoding="UTF-8"?>
        <xs:schema targetNamespace="http://www.mytypes.com"
                   xmlns="http://www.mytypes.com"
                   xmlns:tns="http://www.mytypes.com"
                   xmlns:xs="http://www.w3.org/2001/XMLSchema"
                   attributeFormDefault="qualified"
                   elementFormDefault="qualified">
          <xs:element name="foo" type="tns:Foo"/>
          <xs:element name="bar" type="tns:Bar"/>
          <xs:element name="spam" type="tns:Spam"/>
          <xs:simpleType name="Foo">
            <xs:restriction base="xs:string"></xs:restriction>
          </xs:simpleType>
          <xs:complexType name="Bar">
            <xs:sequence>
              <xs:element ref="spam"/>
            </xs:sequence>
          </xs:complexType>
          <xs:simpleType name="Spam">
            <xs:restriction base="xs:string" />
          </xs:simpleType>
        </xs:schema>

    The document marshalled is:

        <?xml version="1.0" encoding="UTF-8" standalone="yes"?>
        <myDoc xmlns:ns2="http://www.mytypes.com">
          <crap>real crap</crap>
          <ns2:foo>bleh</ns2:foo>
          <ns2:bar>
            <spam>blah</spam>
          </ns2:bar>
        </myDoc>

    Note that the <spam> element uses the default namespace. I would like it to use the ns2 namespace. The schema (mytypes.xsd) expresses the fact that <spam> is contained within <bar>, which in the XML instance is bound to the ns2 namespace. I've been breaking my head over this for over a week, and I would like the ns2 prefix to appear on <spam>. What should I do? Required:

        <?xml version="1.0" encoding="UTF-8" standalone="yes"?>
        <myDoc xmlns:ns2="http://www.mytypes.com">
          <crap>real crap</crap>
          <ns2:foo>bleh</ns2:foo>
          <ns2:bar>
            <ns2:spam>blah</ns2:spam><!--NS NS NS-->
          </ns2:bar>
        </myDoc>

    Read the article

  • How to do call function after client finishes download from tornado web server?

    - by Shabbyrobe
    I would like to be able to run some cleanup functions if and only if the client successfully completes the download of a file I'm serving using Tornado. I installed the Firefox throttle tool, had it slow the connection down to dialup speed, and installed this handler to generate a bunch of rubbish random text:

        class CrapHandler(BaseHandler):
            def get(self, token):
                crap = ''.join(random.choice(string.ascii_uppercase + string.digits) for x in range(100000))
                self.write(crap)
                print "done"

    I get the following output from Tornado immediately after making the request:

        done
        [I 100524 19:45:45 web:772] 200 GET /123 (192.168.45.108) 195.10ms

    The client then plods along downloading for about 20 seconds. I expected that it would print "done" after the client was done. Also, if I do the following I get pretty much the same result:

        class CrapHandler(BaseHandler):
            @tornado.web.asynchronous
            def get(self, token):
                crap = ''.join(random.choice(string.ascii_uppercase + string.digits) for x in range(100000))
                self.write(crap)
                self.finish()
                print "done"

    Am I missing something fundamental here? Can Tornado even support what I'm trying to do? If not, is there an alternative that does?
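    For later readers: in modern Tornado (5.x/6.x) RequestHandler.flush() returns a Future, so a coroutine handler can at least wait until the response bytes have been handed to the socket before running cleanup. That is still not proof the client finished the download (the kernel and any proxies buffer), but it fires much later than the immediate "done" above. A minimal sketch, not the original code; cleanup() is a hypothetical hook:

        import random
        import string

        import tornado.ioloop
        import tornado.web


        class CrapHandler(tornado.web.RequestHandler):
            async def get(self, token):
                # Same rubbish payload as the question, under a coroutine handler.
                crap = ''.join(random.choice(string.ascii_uppercase + string.digits)
                               for _ in range(100000))
                self.write(crap)
                await self.flush()   # resolves once the buffer has been written to the socket
                self.cleanup(token)  # hypothetical post-send hook
                print("done")

            def cleanup(self, token):
                # Placeholder for whatever housekeeping should follow the send.
                pass


        if __name__ == "__main__":
            tornado.web.Application([(r"/(.*)", CrapHandler)]).listen(8888)
            tornado.ioloop.IOLoop.current().start()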

    Read the article

  • Upstart script not working on Ubuntu

    - by Holy Crap
    I'm trying to write an upstart script to start orbited on startup. The script is as follows:

        # orbited

        start on startup
        stop on shutdown

        script
            exec /usr/local/bin/orbited --config=/etc/orbited.cfg
        end script

    When I start orbited via Upstart I get something like this:

        orbited start/running, process 605

    But when I run status orbited right after doing that, I get:

        orbited stop/waiting

    The script fails to start even though it says it's running. Any ideas? Thanks!
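    Two things commonly bite jobs like this, for anyone debugging the same symptom: the startup event fires very early in boot (before networking and most filesystems are up), and if the daemon forks, Upstart ends up tracking the wrong PID and reports stop/waiting, which needs an expect stanza. A sketch of /etc/init/orbited.conf along those lines (the stanzas are standard Upstart; whether orbited forks once, twice, or can stay in the foreground needs checking against its own docs):

        # /etc/init/orbited.conf -- sketch; adjust the events and expect stanza to how orbited behaves
        description "orbited comet server"

        start on runlevel [2345]    # later than "startup": network and filesystems are available
        stop on runlevel [!2345]

        respawn

        # Uncomment ONE of these if orbited daemonizes, so Upstart follows the real PID:
        # expect fork      # daemon forks once
        # expect daemon    # daemon forks twice

        exec /usr/local/bin/orbited --config=/etc/orbited.cfg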

    Read the article

  • Detecting when Javascript is performing poorly

    - by what-the-crap
    I'm working on a web app in jQuery that, on older machines or machines without many resources, may perform poorly. To get around this I'd like to make a degraded version that disables some of the features, particularly those that rely on large images. How can I tell, in jQuery or JavaScript in general, that my app is running poorly on the user's computer? I just need a way to call a function that will degrade the app (especially when the user may be running low on system memory). The only way I can think of is manual user intervention, but that option would add clutter for users who don't need it, and users who do need it may not notice it. Thanks!

    Read the article

  • .NET COM Interop on Windows 7 64Bit gives me a headache

    - by Kevin Stumpf
    Hey guys, .NET COM interop has always worked quite nicely for me so far. Since I upgraded to Windows 7 I can't get my .NET COM objects to work anymore. My COM object is as simple as:

        namespace Crap
        {
            [ComVisible(true)]
            [Guid("2134685b-6e22-49ef-a046-74e187ed0d21")]
            [ClassInterface(ClassInterfaceType.None)]
            public class MyClass : IMyClass
            {
                public MyClass() {}

                public void Test()
                {
                    MessageBox.Show("Finally got in here.");
                }
            }
        }

        namespace Crap
        {
            [Guid("1234685b-6e22-49ef-a046-74e187ed0d21")]
            public interface IMyClass
            {
            }
        }

    The assembly is marked ComVisible as well. I register the assembly using regasm /codebase /tlb "path", which registers successfully (admin mode). I tried regasm 32-bit and 64-bit. Both times I get the error "ActiveX component can't create object: Crap.MyClass" using this VBScript:

        dim objReg
        Set objReg = CreateObject("Crap.MyClass")
        MsgBox typename(objReg)

    fuslogvw doesn't give me any hints either. That COM object works perfectly on my Vista 32-bit machine. I don't understand why I haven't been able to google a solution for this problem... am I really the only person who ever ran into it? Looking at OleView I see my object is registered successfully. I am able to create other COM objects as well... it only does not work with my own ones. Thank you, Kevin

    Read the article

  • Blog comment spammers … help!

    - by Steve Clements
    Hey, I need some advice/help. I've been writing on this blog for a few years now; I don't blog a great deal compared to many of my peers and my blog doesn't get a massive number of hits, BUT I seem to get more than my fair share of comment spammers!! I have image verification on the comments and I have put an Akismet API key into the Geekswithblogs settings, but I still get a bunch of spam comments. They could almost pass for real, until you see the link going off to some dating site or casino crap. What does everyone do?? Delete them every time?? Moderate comments?? I would rather not moderate comments if possible, but is that the only way to stop the crap?? Thx Technorati Tags: spammers, comments, help

    Read the article

  • ubuntu is so damn frustrating! [on hold]

    - by Bob Kathy Mosca
    Three damn weeks I've been trying to follow all the suggestions I can find to get that stupid Broadcom BCM4312 to work, and still nothing! I think you Ubuntu users have forgotten what it's like to really know NOTHING about computers. Your advice is next to worthless, and you all seem to "assume" the new user is at least (what YOU consider) "computer literate"... think again. No wonder that crap system from Microsoft dominates! They know the public is stupid, so their system does this crap for you! Damn! What a pity too.

    Read the article

  • How do you stop scripters from slamming your website hundreds of times a second?

    - by davebug
    [update] I've accepted an answer, as lc deserves the bounty due to the well thought-out answer, but sadly, I believe we're stuck with our original worst case scenario: CAPTCHA everyone on purchase attempts of the crap. Short explanation: caching / web farms make it impossible for us to actually track hits, and any workaround (sending a non-cached web beacon, writing to a unified table, etc.) slows the site down worse than the bots would. There is likely some pricey bit of hardware from Cisco or the like that can help at a high level, but it's hard to justify the cost if CAPTCHAing everyone is an alternative. I'll attempt to do a fuller explanation in here later, as well as cleaning this up for future searchers (though others are welcome to try, as it's community wiki).

    I've added bounty to this question and attempted to explain why the current answers don't fit our needs. First, though, thanks to all of you who have thought about this; it's amazing to have this collective intelligence to help work through seemingly impossible problems. I'll be a little more clear than I was before: this is about the bag o' crap sales on woot.com. I'm the president of Woot Workshop, the subsidiary of Woot that does the design, writes the product descriptions, podcasts, and blog posts, and moderates the forums. I work in the CSS/HTML world and am only barely familiar with the rest of the developer world. I work closely with the developers and have talked through all of the answers here (and many other ideas we've had). Usability of the site is a massive part of my job, and making the site exciting and fun is most of the rest of it. That's where the three goals below derive from. CAPTCHA harms usability, and bots steal the fun and excitement out of our crap sales.

    To set up the scenario a little more: bots are slamming our front page tens of times a second, screen-scraping (and/or scanning our RSS) for the Random Crap sale. The moment they see it, that triggers a second stage of the program that logs in, clicks "I want One", fills out the form, and buys the crap.

    In current (2/6/2009) order of votes:

    lc: On Stack Overflow and other sites that use this method, they're almost always dealing with authenticated (logged-in) users, because the task being attempted requires that. On Woot, anonymous (non-logged-in) users can view our home page. In other words, the slamming bots can be non-authenticated (and essentially non-trackable except by IP address). So we're back to scanning for IPs, which a) is fairly useless in this age of cloud networking and spambot zombies and b) catches too many innocents given the number of businesses that come from one IP address (not to mention the issues with non-static-IP ISPs and the potential performance hits from trying to track this). Oh, and having people call us would be the worst possible scenario. Can we have them call you?

    BradC: Ned Batchelder's methods look pretty cool, but they're pretty firmly designed to defeat bots built for a network of sites. Our problem is bots built specifically to defeat our site. Some of these methods could likely work for a short time, until the scripters evolved their bots to ignore the honeypot, screen-scrape for nearby label names instead of form ids, and use a javascript-capable browser control.

    lc again: "Unless, of course, the hype is part of you

    Read the article

  • write c++ in latex, noob latex question

    - by voodoomsr
    Maybe this is a noob question, but I can't find the solution on the web: I need to write "C++" in LaTeX. I write C$++$ but the result looks like crap; the signs are too big and there is too much space between the C and the first plus sign. Previously I needed to write the sharp symbol for C#: c$\sharp$ also looks like crap, but with an escape character it looks nice. For the plus sign I can't do the same.
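    One low-tech option, sketched below: either typeset the logo in \texttt, or kern the plus signs up and together by hand. The macro names are made up and the kerning values are a matter of taste:

        \documentclass{article}
        % Illustrative macros, not an official package: a monospaced variant and a
        % hand-kerned variant. Adjust the \kern amount and raise height to taste.
        \newcommand{\CppTT}{\texttt{C++}}
        \newcommand{\Cpp}{C\kern-.05em\raisebox{.35ex}{\scriptsize\textbf{++}}}
        \begin{document}
        We write \CppTT{} code, or, typeset more tightly, \Cpp{} code.
        \end{document}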

    Read the article

  • 301 Redirecting URLs based on GET variables in .htaccess

    - by technicalbloke
    I have a few messy old URLs like:

        http://www.example.com/bunch.of/unneeded/crap?opendocument&part=1
        http://www.example.com/bunch.of/unneeded/crap?opendocument&part=2

    ...that I want to redirect to the newer, cleaner form:

        http://www.example.com/page.php/welcome
        http://www.example.com/page.php/prices

    I understand I can redirect one page to another with a simple redirect, i.e.

        Redirect 301 /bunch.of/unneeded/crap http://www.example.com/page.php

    But the source page doesn't change, only its GET vars. I can't figure out how to base the redirect on the value of these GET variables. Can anybody help, please? I'm fairly handy with the old regexes, so I can have a pop at using mod_rewrite if I have to, but I'm not clear on the syntax for rewriting GET vars and I'd prefer to avoid the performance hit and use the cleaner Redirect directive. Is there a way? And if not, can anyone clue me in on the right mod_rewrite syntax please? Cheers, Roger.
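    For reference, the mod_alias Redirect directive matches only the URL path, never the query string, which is why the simple form above can't tell part=1 from part=2. A mod_rewrite sketch of the kind of rule that can (placed in the site-root .htaccess; URLs taken from the question, patterns untested):

        RewriteEngine On

        # part=1 -> /page.php/welcome (the trailing "?" drops the old query string)
        RewriteCond %{QUERY_STRING} (^|&)part=1($|&) [NC]
        RewriteRule ^bunch\.of/unneeded/crap$ http://www.example.com/page.php/welcome? [R=301,L]

        # part=2 -> /page.php/prices
        RewriteCond %{QUERY_STRING} (^|&)part=2($|&) [NC]
        RewriteRule ^bunch\.of/unneeded/crap$ http://www.example.com/page.php/prices? [R=301,L]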

    Read the article

  • Squid, NTLM, Windows 7 and IE8

    - by Harley
    I'm running Squid 2.7-stable4, Samba 3 and the Windows 7 RC with IE8. I have NTLM authentication set up on my Squid proxy server and it works fine for every combination of browser and Windows (including IE8 on XP and Firefox on Win7), but it doesn't work (it keeps asking for authentication) for IE8 on Windows 7. I can get it to work using the LmCompatibilityLevel registry hack, but I'd really prefer to get it working on the server. Does anyone have any experience with this? Or know where to start looking? The Samba logs don't reveal much. EDIT: Here's what the wb-MYDOMAIN log says when I attempt to authenticate:

        [2009/08/20 15:13:36, 4] nsswitch/winbindd_dual.c:fork_domain_child(1080)
          child daemon request 13
        [2009/08/20 15:13:36, 10] nsswitch/winbindd_dual.c:child_process_request(478)
          process_request: request fn AUTH_CRAP
        [2009/08/20 15:13:36, 3] nsswitch/winbindd_pam.c:winbindd_dual_pam_auth_crap(1755)
          [ 4127]: pam auth crap domain: MYDOMAIN user: MYUSER
        [2009/08/20 15:13:36, 0] nsswitch/winbindd_pam.c:winbindd_dual_pam_auth_crap(1767)
          winbindd_pam_auth_crap: invalid password length 24/282
        [2009/08/20 15:13:36, 2] nsswitch/winbindd_pam.c:winbindd_dual_pam_auth_crap(1931)
          NTLM CRAP authentication for user [MYDOMAIN]\[MYUSER] returned NT_STATUS_INVALID_PARAMETER (PAM: 4)
        [2009/08/20 15:13:36, 10] nsswitch/winbindd_cache.c:cache_store_response(2267)
          Storing response for pid 4547, len 3240

    Read the article

  • Why does every change on the Asus UX50v cause the screen resolution to be lost?

    - by Kaveh Shahbazian
    Why does every change on my Asus UX50v cause the screen resolution to be lost? Installing a new application, connecting to another wireless network, changing some settings, ... all cause this problem. For example, after installing an application the UX50v needs to restart, and when it restarts the resolution is set to 640x480 (or 800x600) and the Hibernate and Sleep options have disappeared from the shutdown menu! (I have other problems with this Asus UX50v crap too - like I can't update Windows 7 because it crashes on this Asus UX50v crap - but this one is absolutely ridiculous and stupid!)

    Read the article

1 2 3 4 5 6 7 8 9 10 11 12  | Next Page >