Search Results

Search found 28300 results on 1132 pages for 'david back'.


  • Windows HTTP proxy client to pass service requests to VPN

    - by Chris
    I've got access to a network via CheckPoint VPN (Windows client). The problem is, I have a Linux box that needs to talk to its web services, and the target web servers are inside the VPN. So far, we have been unable to connect Linux to the VPN (and I'm not trying to solve that problem at the moment). I'm wondering if (temporarily) I can set up a proxy server on a Windows (XP) box to shuttle HTTP requests back and forth? If so, what would be a good (hopefully free/open-source) application to do this? TIA
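
    If a plain HTTP proxy (any free one will do) ends up running on the Windows VPN client, the Linux side only needs to be pointed at it. A minimal sketch of that half; the proxy address, port, and internal hostname are placeholders, not anything from the question:

        # hypothetical address/port of the proxy running on the Windows box
        export http_proxy=http://192.168.1.50:3128
        export https_proxy=http://192.168.1.50:3128

        # most command-line clients (curl, wget) honour these variables
        curl -v http://internal-service.example/api/status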

    Read the article

  • Seeking .htaccess help: Converting multiple subdomains (both HTTP and HTTPS) to www.domain.com using .htaccess

    - by Joshua Dorkin
    I've been trying to get an answer to this question on other forums (the folks at SuperUser thought this was the place I needed to post) and via my connections, but I haven't gotten very far. Hopefully you guys can help me find an answer. I've got a dozen old subdomains that have been indexed by Google. These have been indexed as both HTTP AND HTTPS. I've managed to redirect all the subdomains properly, provided they are not HTTPS, but can't get any of the HTTPS subdomains to properly redirect. Here's the code I'm using:

        RewriteCond %{HTTP_HOST} ^subdomain1.mysite.com$ [NC]
        RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
        RewriteCond %{HTTP_HOST} ^subdomain2.mysite.com$ [NC]
        RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
        RewriteCond %{HTTP_HOST} ^subdomain3.mysite.com$ [NC]
        RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]

    This works great until someone goes to https://subdomain2.mysite.com, which is not redirected back to http://www.mysite.com. How can I get this to work? Additionally, I'm guessing there is an easier way to make it happen than setting up a dozen pairs of RewriteCond/RewriteRule? Is there any way to do this in just a few lines, including one where I list all the subdomains? I'd actually also like to redirect everything on https://www.mysite.com to http://www.mysite.com except for 3 folders: mysite.com/secure, mysite.com/store, and mysite.com/user. Is there any good way to add this to the .htaccess file?

    Read the article

  • WordPress is now nicely supported on SQL Server (and SQL Azure for that matter)

    - by Eric Nelson
    WordPress is enormously popular for blogs and full websites thanks to an awesome eco system which has built up around it, the simplicity (relatively) of getting it up and running, plus the flexibility to “bend it” in all sorts of directions. When I say bend, check out the following, which are all WordPress sites:

    - My “back up blog” http://iupdateable.wordpress.com/
    - My group's “odd site” :) http://ubelly.com
    - My favourite “cheap games” site http://www.frugalgaming.co.uk/

    WordPress users typically run their sites on Linux and MySQL, although PHP (the language in which WordPress is written) can be happily run on Windows. Both are fine technologies in their own right, but for me (and probably a fair few others) I would love to use WordPress but with the technologies I know best (aka Windows, IIS and SQL Server). However, that has proven to be rather tricky in practice to get working – until now. Earlier last month OmniTI released a patch for WordPress which provides SQL Server and SQL Azure support. In parallel with that, some fine folks inside Microsoft have also created http://wordpress.visitmix.com which contains information about running WordPress on the Microsoft platform with a particular focus on SQL Server and SQL Azure. Top stuff!

    To run WordPress with SQL Server:

    - Download and install the WordPress on SQL Server distro/patch.
    - You will then quite likely need to migrate: check out how to Migrate to Windows and SQL Server by Zach Owens, who is moving his blog to Windows and SQL Server.

    Enjoy!

    Related links:

    - Running PHP on IIS on Windows: http://php.iis.net/
    - If PHP is not your thing, then the following blog engines are .NET based: BlogEngine http://www.dotnetblogengine.net/, DasBlog http://www.dasblog.info/, and Subtext http://subtextproject.com/ (which happens to power http://geekswithblogs.net where my main blog is http://geekswithblogs.net/iupdateable).

    Read the article

  • Database hidden in SQL Server

    - by Colin Desmond
    During an aborted TFS import (2008 into 2010), I have managed to "lose" a database in 2008. The database is not visible in Management Studio, but the SQL Server exe has a handle on the .mdf file (according to UnLocker). It says it cannot attach the file because it is in use, and it cannot attach a copy of the file (created while SQL Server was stopped) because it says a DB of the same name is already attached. Given that I am using the same TFS admin account I have always used, and have always been able to see the database with, why is this database missing and, more importantly, how do I get it back again?

    Read the article

  • Redhat doesn't set my desired hostname on reboot

    - by tomdee
    I have a redhat (EL5) server that I need to change the hostname on. I'm trying to put it back into a known state to help with server provisioning activities. As part of changing the hostname, I'm updating /etc/sysconfig/network and /etc/hosts. I also have an explicit call to hostname. My desired state is that the server thinks its hostname is "localhost". And a call to "hostname" returns "localhost". The problem I'm having is that when I reboot, the hostname is reverted to "localhost.companyname.com" which is not what I want. How do I ensure that the hostname is set up as just "localhost" when I reboot?

    My /etc/sysconfig/network file contains:

        NETWORKING=yes
        HOSTNAME=localhost
        GATEWAY=123.123.123.123  #I do have a proper IP address here

    My /etc/hosts file contains:

        127.0.0.1    localhost.localdomain localhost
        172.21.1.1   localhost.companyname.com localhost
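
    A couple of hedged checks, assuming a stock EL5 boot sequence. One common culprit (an assumption worth verifying on this box) is the DHCP client script, which may replace a hostname of "localhost" with whatever DHCP or reverse DNS returns for the interface's address:

        # confirm what the init scripts will read at boot
        grep HOSTNAME /etc/sysconfig/network
        grep -i hostname /etc/sysconfig/network-scripts/ifcfg-*   # look for DHCP_HOSTNAME / BOOTPROTO=dhcp

        # does anything resolve the box's own IP back to the company FQDN?
        getent hosts 172.21.1.1

        # blunt workaround: force the name late in the boot sequence
        echo 'hostname localhost' >> /etc/rc.local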

    Read the article

  • Delay at Windows Log On Screen

    - by Ryan Elkins
    I'm getting a delay when entering my password at the log on screen in Windows 7. What I mean is that I can select the text box but pressing keys does nothing. I had this problem initially on this computer with Vista installed - when I reformatted and installed Windows 7 the problem went away - but over the weeks it is slowly coming back (the delay seems to be getting longer). Any clues as to why this is happening and how I can fix it? It currently takes about 4-8 seconds before it will accept keyboard input.

    Read the article

  • Gems In The Visual Studio 2010 Training Kit - Introduction to ASP.NET MVC: Learning Labs

    - by Jim Duffy
    Following up on my prior “gems post” is another nugget I found in the Visual Studio 2010 and .NET Framework 4 Training Kit. ASP.NET MVC has established quite a bit of momentum in the ASP.NET development community since it was introduced in early-ish 2009 though I’m sure there are many developers who haven’t had the time or opportunity to find out what it is, not to mention learn how to use it. If you’re one of those “I’ve heard of it but I’m not sure what it really is” developers then I suggest you start your research here. Ok, back to the gem. There are a number of fantastic MVC learning resources out there including the video tutorials on the ASP.NET MVC website. Another learning resource for your journey along the yellow brick road into ASP.NET MVC land are the hands-on learning labs contained in the Visual Studio 2010 and .NET Framework 4 Training Kit. These hands-on exercises walk you through the process of creating the “M”, the “V”s, and the “C”s of ASP.NET MVC and help you gain a solid foothold into the details of creating and understanding ASP.NET MVC applications. Have a day. :-|

    Read the article

  • Refactoring this code that produces a reverse-lookup hash from another hash

    - by Frank Joseph Mattia
    This code is based on the idea of a Form Object http://blog.codeclimate.com/blog/2012/10/17/7-ways-to-decompose-fat-activerecord-models/ (see #3 if unfamiliar with the concept). My actual code in question may be found here: https://gist.github.com/frankjmattia/82a9945f30bde29eba88

    The code takes a hash of objects/attributes and creates a reverse lookup hash to keep track of their delegations, which are set up like this:

        delegate :first_name, :email, to: :user, prefix: true

    But I am manually creating the delegations from a hash like this:

        DELEGATIONS = { user: [ :first_name, :email ] }

    At runtime, when I want to look up the translated attribute names for the objects, all I have to go on are the delegated/prefixed attribute names like :user_first_name (I have to use a prefix to avoid naming collisions), which aren't in sync with the Rails i18n way of doing it:

        en:
          activerecord:
            attributes:
              user:
                email: 'Email Address'

    The code I have takes the above delegations hash and turns it into a lookup table, so that when I override human_attribute_name I can get back the original attribute name and its class. Then I send #human_attribute_name to the original class with the original attribute name as its argument. The code I've come up with works, but it is ugly to say the least. I've never really used #inject, so this was a crash course for me, and I am quite unsure if this code is an effective way of solving my problem. Could someone recommend a simpler solution that does not require a reverse lookup table, or does that seem like the right way to go? Thanks, - FJM

    Read the article

  • My search for what the Cloud will mean for my work

    - by Kay Sellenrode
    Since I finished my MCM Exchange 2007 training back in April 2009, I've been struggling with the Cloud. I know it will change the way we do things today, but how will it affect my work? My work is Exchange consultancy, mostly in the Netherlands but more and more across the globe. In my job as a consultant I noticed last year that a large percentage of my customers showed interest in the cloud services available today, but in most situations it seemed that it wasn't the right time for them to switch to a cloud service at this moment. Right now one of the customers I'm helping is exploring Exchange Online, and it looks like they will switch over from their on-premise Exchange solution. This made me realize more than ever that I need to do something to not miss the boat. With Office 365 coming this year, my expectation is that cloud services will take off from now on. I'm also sure that quite a few customers will expect me to help them with their decision between the cloud and the on-premise solution. So in the coming months I will explore all the possibilities of Office 365, but also some of the competition in this field. In my search for what the cloud will mean for me and my customers, I will go over all the aspects of the offered solutions. Any help in my search is always welcome. I'm looking forward to the ideas people have around the cloud and how it will change the IT environment, especially in the unified communications field. Next week I will post my first article about my experiences with the cloud so far.

    Read the article

  • When does a Project Manager start in a project?

    - by johndoucette
    From a colleague of mine… “As a project manager, when do you typically like to get initially involved in the project? Is it better for the PM to be rolled on during the project kick-off, the first week, or is it better to roll on in the second week when things settle down?” My textbook answer is “the Project Manager is responsible for the successful completion and delivery of the expected outcome of the project through the following major tasks:”

    1. Identifying requirements
    2. Establishing clear and achievable objectives
    3. Balancing the competing demands for quality, scope, time, and cost
    4. Adapting the specifications, plans, and approach to the different concerns and expectations of the various stakeholders

    However, my colleague is often a lead technical consultant coming into a project alone to help a client solve a complex problem. As Magenic consultants, we all possess many of the “project managing” skills I talked about above and tend to be responsible for items #1 and #2, as well as the actual architecture/design tasks, early in a project. When the real development begins and there is no PM involved, the project will quickly get harder to execute unless items #3 and #4 are assigned to a Project Manager. In software development, context switching between coding and other administrative activities is the hardest skill to perfect. In my experience, I have rarely been introduced to someone who has mastered this skill. This is the limbo I was in when I was asked to become a PM -- while still developing. “Put down the code” was not only a profound statement, but looking back – a necessary one. Unless you are lucky enough to have found that one developer who is a superman, asking your developers (internal corporate or consultant) to perform the #3 and #4 tasks will surely take more time, allow opportunity for more scope, and eventually cost more. Project Managers are crucial to the overall success of a project, and I prefer them to start by taking ownership of delivery on day one.

    Read the article

  • Joining H264 *without* re-encoding

    - by jdmuys
    Hi, I have two halves of a single show in two .MP4 files, encoded in H264. I would like to join them without re-encoding. Is this possible? I managed to create a joined video as a Quicktime file (.mov) using Quicktime Pro, but then Quicktime Pro will not convert it back to .MP4 without re-encoding. This may be because looking inside the .mov file, the two H264 videos are in there still separated as individual "objects". I am also struggling with MPEG StreamClip without reaching a real solution. But I may have missed something. Note that I have the same issue with MPEG2 files. I can export them to a .MPEG container or a .TS file for example, but I don't know how to join them without re-encoding. Any suggestion welcome, preferably using Mac software.
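
    One hedged approach, assuming both halves really do share the same codec parameters (resolution, profile, audio format): ffmpeg's concat demuxer with stream copy joins files without re-encoding, and ffmpeg runs fine on a Mac. File names here are placeholders:

        # list the parts in playback order (concat demuxer input)
        printf "file '%s'\n" show-part1.mp4 show-part2.mp4 > parts.txt

        # -c copy keeps the existing H264 and audio streams untouched (no re-encode);
        # the same command works for the MPEG2 files if their parameters match
        ffmpeg -f concat -safe 0 -i parts.txt -c copy show-joined.mp4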

    Read the article

  • Fail to upgrade from 10.10 to 11.04

    - by Ana Solís
    I was using Natty for a while, but while updating to the new release there was a blackout; the update wasn't able to finish and Ubuntu failed to load after that. I thought, no worries, I have my files backed up and I still have the 10.10 CD I used to put Ubuntu on my computer in the first place. So I installed it again, with the plan of using the update manager to get myself onto the current release... Except I get this error:

        W:Failed to fetch http://extras.ubuntu.com/ubuntu/dists/natty/main/source/Sources.gz 404 Not Found
        W:Failed to fetch http://extras.ubuntu.com/ubuntu/dists/natty/main/binary-amd64/Packages.gz 404 Not Found
        E:Some index files failed to download, they have been ignored, or old ones used instead.

    My internet connection is just fine, seeing as I'm able to post this, but I don't know what else to do. I tried downloading Quantal on another computer and putting it on a DVD (since it won't fit on a CD...), but the machine fails to load it; it skips it over and goes right back to Maverick... (not a faulty disc, it installed Ubuntu just fine on a friend's computer...)
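
    A hedged sketch of the usual workaround, on the assumptions that the 404s come from extras.ubuntu.com lines in /etc/apt/sources.list and that the release involved has reached end of life (packages for EOL releases move to old-releases.ubuntu.com):

        # comment out the extras.ubuntu.com entries that now return 404
        sudo sed -i '/extras\.ubuntu\.com/s/^deb/#deb/' /etc/apt/sources.list

        # for an EOL release, point the main mirrors at the old-releases archive
        sudo sed -i 's|archive\.ubuntu\.com|old-releases.ubuntu.com|g; s|security\.ubuntu\.com|old-releases.ubuntu.com|g' /etc/apt/sources.list

        sudo apt-get update
        sudo do-release-upgrade   # or retry the update manager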

    Read the article

  • I need a few minutes of dedicated server time a week, but not for hosting, just to convert to ogg etc.

    - by talkingnews
    I'm completely happy with my webhosting, it's just that I need to do one little thing they won't allow, and that's run an instance of Sox to convert about 30 mp3s to ogg files, in various directories, a couple of times a week, to be done automatically in response to the detection of the upload of an mp3. Probably looking at a minute of server time over the whole week. I've had unhelpful suggestions on other forums like "why not leave your home PC on 24 hours a day and then use all your isp bandwidth to do this", which doesn't work for me. I know that I can host files on, say, Amazon S3, but is there something similar for my needs? All it would need to do would be: wget/ftp the mp3 files, convert them to ogg, ftp the files back to my hosting. Of course, all this wouldn't be needed if there was such a thing as a compiled binary of Sox (or any mp3ogg converter) for Centos which I could upload without needing root access, but I've given up asking that one, but always open to suggestions!
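
    For what it's worth, a sketch of the whole weekly job as a shell script, in case a small VPS or a friendly shell account turns up; host names, paths, and credentials are placeholders:

        #!/bin/bash
        set -e

        # 1. fetch the new mp3s (here over HTTP; ftp/sftp works the same way)
        wget -r -np -nd -A '*.mp3' -P incoming/ http://www.example.com/uploads/

        # 2. convert each one (needs a sox build with mp3 and vorbis support;
        #    "ffmpeg -i in.mp3 out.ogg" is a drop-in alternative)
        for mp3 in incoming/*.mp3; do
            ogg="${mp3%.mp3}.ogg"
            sox "$mp3" "$ogg"
            # 3. push the result back to the web host
            curl -T "$ogg" "ftp://user:pass@ftp.example.com/uploads/"
        done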

    Read the article

  • Different file locations for http v https on IIS?

    - by Jeremy Morgan
    We have a server running IIS and have some folders running under https, but most are open. The problem I'm having is that when someone is directed from a page in the secure section of the site, the relative link brings up https. For example, a link to /pictures goes to http://www.mysite.com/pictures. But if someone is on a secured part of the site, such as https://www.mysite.com/shoppingcart, and then clicks back to /pictures, they get https://www.mysite.com/pictures, so the pictures directory is served under https. My problem is that they get a 404 Not Found message when this happens. I could not find anything in the settings that would indicate that secured connections are pulling files from anywhere different than non-secured ones. If I type http or https on the main page of the site, both come up fine. But if I try to add the https:// at a folder level, I get a 404. Any ideas why this might be happening?

    Read the article

  • Fight for your rights as a video gamer.

    - by Chris Williams
    Soon, the U.S. Supreme Court may decide whether to hear a case that could have a lasting impact on computer and video games. The case before the Court involves a law passed by the state of California attempting to criminalize the sale of certain computer and video games. Two previous courts rejected the California law as unconstitutional, but soon the Supreme Court could have the final say. Whatever the Court's ruling, we must be prepared to continue defending our rights now and in the future. To do so, we need a large, powerful movement of gamers to speak with one voice and show that we won't sit back while lawmakers try to score political points by scapegoating video games and treating them differently than books, movies, and music. If the Court decides to hear the case, we're going to need thousands of activists like you who can help defend computer and video games by writing letters to editors, calling into talk radio stations, and educating Americans about our passion for and appreciation of computer and video games. You can help build this movement right now by inviting all your friends and fellow gamers to join the Video Game Voters Network. Use our simple tool to send an email to everyone you know asking them to stand up for gaming rights: http://videogamevoters.org/movement You can also help spread the word through Facebook and Twitter, or you can simply forward this email to everyone you know and ask them to sign up at videogamevoters.org. Time after time, courts continue to reject politicians' efforts to restrict the sale of computer and video games. But that doesn't mean the politicians will stop trying anytime soon -- in fact, it means they're likely to ramp up their efforts even more. To stop them, we must make it clear that gamers will continue to stand up for free speech -- and that the numbers are on our side. Help make sure we're ready and able to keep fighting for our gaming rights. Spread the word about the Video Game Voters Network right now: http://videogamevoters.org/movement Thank you. -- Video Game Voters Network

    Read the article

  • Virtualised Sharepoint Backup Strategies

    - by dunxd
    I have a Sharepoint (OSS 2007) farm running on three virtual machines in VMWare ESX, plus a SQL Server backend on physical hardware. During a recent Business Continuity Planning event I tried restoring the sharepoint farm with only the config and content databases, and failed to get things working. My plan was to build a new sharepoint server, then attach this to a restoration config database and install the Central Management site on this server, then reattach the content databases. This failed at the Central Management part of the plan. So I am back to the drawing board on the best strategy for backup and recovery, with reducing the time and complexity of the restore job the main objective. I haven't been able to find much in the way of discussion of backup/restore strategies for Sharepoint in a VMWare environment, so I figured I'd see if anyone on server fault has any ideas or experience.

    Read the article

  • Weird problem, special characters coming after formatting the hard drive, computer isn't starting 0_

    - by m3taspl0it
    Hya friends, last night my computer was working fine. But today when I came back from college and started it, it would begin booting fine but then kept restarting, again and again at different points, so I tried to boot it in safe mode but had the same problem. After all this, I finally decided to format the C drive (it is on the 80 GB disk) and load a fresh copy of Windows XP SP3. After formatting (quick format) and loading the XP SP3 setup files, when it rebooted to copy the actual OS files, it hung and a weird screen came up. I've attached a picture of the error: http://www.postimage.org/image.php?v=gxz1vKS

    My specs:

    - P4 3.0 GHz
    - 2 GB RAM (2x 512 MB and 1 GB)
    - 3 hard drives: 80 GB (around 5 years old), 320 GB (around 2 years old), 500 GB (recently bought)
    - 256 MB graphics card

    Any help is very appreciated, thanks

    Read the article

  • Execute background program in bash without job control

    - by Wu Yongzheng
    I often execute GUI programs, such as firefox and evince, from the shell. If I type "firefox &", firefox is treated as a bash job, so "fg" will bring it to the foreground and "hang" the shell. This becomes annoying when I have some background jobs such as vim already running. What I want is to launch firefox and dissociate it from bash. Consider the following ideal case with my imaginary runbg:

        $ vim foo.tex
          (ctrl+z, and vim is job 1)
        $ pdflatex foo
        $ runbg evince foo.pdf
          (evince runs in the background and I get my bash prompt back)
        $ fg
          (vim goes to the foreground)

    Is there any way to do this using an existing program? If not, I will write my own runbg.
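
    A minimal runbg along the lines asked for, assuming bash: nohup keeps the program alive if the terminal goes away, the redirects silence it, and disown removes it from the job table so fg and jobs never see it again:

        runbg() {
            nohup "$@" >/dev/null 2>&1 &
            disown
        }

        runbg evince foo.pdf

        # alternatives: "setsid evince foo.pdf >/dev/null 2>&1 &" detaches it from the
        # terminal session entirely, and "evince foo.pdf & disown" is often enough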

    Read the article

  • How to set up an SSL Cert with Subject Alternative Name

    - by Darren Oster
    To test a specific embedded client, I need to set up a web server serving a couple of SSL (HTTPS) sites, say "main.mysite.com" and "alternate.mysite.com". These should be handled by the same certificate, with a Subject Name of "main.mysite.com" and a Subject Alternative Name of "alternate.mysite.com". This certificate needs to be in an authority chain back to a 'proper' CA (such as GoDaddy, to keep the cost down). My question is, are there any good tutorials on how to do this, or can someone explain the process? What sort of parent certificate do I need to purchase from the CA provider? My understanding of SSL certificates is limited, but as Manuel said in Fawlty Towers, "I learn...". I'm happy to work in Windows (IIS) or Linux (Apache) (or even OSX, for that matter). Thanks in advance.
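
    On the certificate-request side, a hedged sketch with OpenSSL (the same key and CSR work whether they end up on Apache or get imported into IIS); commercial CAs usually sell this as a "multi-domain"/UCC/SAN certificate, and the host names below are taken from the question:

        # needs OpenSSL 1.1.1+ for -addext; older versions want the SAN listed
        # in a config file passed via -config instead
        openssl req -new -newkey rsa:2048 -nodes \
            -keyout mysite.key -out mysite.csr \
            -subj "/CN=main.mysite.com" \
            -addext "subjectAltName=DNS:main.mysite.com,DNS:alternate.mysite.com"

        # double-check the CSR before submitting it to the CA
        openssl req -in mysite.csr -noout -text | grep -A1 "Subject Alternative Name"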

    Read the article

  • problems with WindowsImageBackup and write protected drives

    - by Ralph Shillington
    On Windows 7 I created a system image of my computer (C: and the reserved partition) onto a USB drive. No problem. I then formatted C: and installed the OS -- no problem. Now I would like to use the system image and get back some of my documents etc., but I can't get access to the WindowsImageBackup folder on the USB drive.

    1) Somehow the drive is write protected --- how did that happen? How do I unprotect that drive?
    2) I can't access the WindowsImageBackup folder because I suspect the ACL is out of whack with my new SID. I would add my new SID to the ACL, but I can't because the drive is write protected.

    At the moment I'm completely disconnected from my files, which I thought (and still hope) are backed up. Understandably, panic is now setting in.

    Read the article

  • Microsoft, Hotmail, Live, MSN, Outlook: unable to send emails, and no support received from Microsoft in the 3 months we have been asking

    - by HugeNut
    Ok this is somenthing unbelievable, we have a website, users sign up and receives links to confirm they signed up BUT: 1 - microsoft blocked our IP (no one with microsoft email account can receive our emails) 2 - we tryed contacting microsoft submitting the detailed form about our problem 3 - we posted 3 times in their community about our problem 4 - we tweeted they about our problem 5 - we tryed finding out some telephone support number (the few there are arent' helping at all) Do you think we solved? the answer is NO :/ We still unable to send emails from our IP to microsoft email accounts, since 3 months back. Our emails are perfect we checked all the email headers following microsoft guidelines but it seems not enought, checking our IP reputation it seems everythings ok, indeed we can send email easly to any other provider , gmail, yahoo, etc Do you know any other way to try to get help ? FULL ERROR RETURNED BY MICROSOFT: host mx1.hotmail.com[65.55.37.120] said: 550 SC-001 (COL0-MC4-F28) Unfortunately, messages from 94.23.***** weren't sent. Please contact your Internet service provider since part of their network is on our block list. You can also refer your provider to http://mail.live.com/mail/troubleshooting.aspx#errors. (in reply to MAIL FROM command) We are running NGIX + php mailer from a Virtual Private Server (No Hosting or shared hosting)

    Read the article

  • Strange Inode/Ram cache drops happening in CentOS

    - by FunkyChicken
    I run a CentOS 5.7 machine (64-bit) with 24GB RAM and 4x SAS drives in a RAID10 setup. This machine runs nginx/1.0.10, php-fpm & xcache. About a month back the RAM usage pattern of this machine changed. Every few hours the 'CACHE' is flushed from RAM, and this happens exactly when the 'Inode table usage' drops; I'm pretty sure these drops are related (see the 2 attached images). This server hosts quite a lot of small files (20M of them, each a few KB). Not many files are deleted (maybe 100 per hour, a few MB in total at most), not enough to account for the huge inode table drops. I also have no crons running which could cause these drops. sar -r output: http://pastebin.com/C4D0B79i My questions: why are these huge RAM/inode usage drops happening, and how can I get nginx/PHP to use all of my server's RAM?
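
    A few hedged diagnostics (not a fix) that may narrow down where the drops come from; the suspected causes are assumptions to verify on the box itself:

        # is anything scheduled on the machine explicitly dropping the caches?
        grep -r drop_caches /etc/cron* /etc/rc.d 2>/dev/null

        # how aggressively does the kernel reclaim the inode/dentry cache?
        cat /proc/sys/vm/vfs_cache_pressure    # default is 100; lower values keep more cached

        # watch the dentry/inode slabs live while a drop happens
        slabtop -o | head -20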

    Read the article

  • Learn How Ancestry.com Helps Families Uncover Their History with Oracle WebCenter

    - by Christie Flanagan
    Delivering Exceptional Online Customer Experiences

    Ancestry.com is the world’s largest online family history resource, providing an engaging and interactive customer experience to more than 1.7 million members. With smart search technology, a wealth of learning resources, and a worldwide community of family history enthusiasts, Ancestry.com helps people discover their roots and tell their unique family stories. Key to Ancestry.com’s success has been the delivery of an online customer experience that converts site visitors into paying subscribers and keeps them coming back. To help achieve this goal, Ancestry.com turned to Oracle’s Web experience management solution, Oracle WebCenter Sites. Join us as executives from Ancestry.com and Oracle discuss how Oracle’s Web experience management solution is helping them deliver engaging online experiences. Learn how:

    - Ancestry.com selected Oracle WebCenter Sites to meet their demanding Web experience management requirements
    - The company was able to get up and running quickly despite a complex technology stack and challenging integration requirements with legacy systems
    - Ancestry.com empowered business users to manage the online experience and significantly reduce time to market for their online campaigns and initiatives

    Register now for the Webcast: Thursday, June 28, 2012, 10 a.m. PT / 1 p.m. ET

    Presented by:
    - Blane Nelson, Chief Architect – Applications, Ancestry.com
    - Christie Flanagan, Director of Product Marketing, Oracle WebCenter Sites, Oracle

    Read the article

  • Confusion in attaching/detaching screen on Ubuntu

    - by Registered User
    Hi, screen -list shows:

        There are screens on:
                9531.pts-0.ubuntu   (03/02/2011 12:43:34 PM)   (Detached)
                2101.pts-0.ubuntu   (03/02/2011 12:39:17 PM)   (Attached)
                2219.pts-0.ubuntu   (03/02/2011 11:20:56 AM)   (Attached)
        3 Sockets in /var/run/screen/S-bond.

    but when I type screen -r 2101.pts-0.ubuntu I get:

        There is a screen on:
                2101.pts-0.ubuntu   (03/02/2011 12:39:16 PM)   (Attached)
        There is no screen to be resumed matching 2101.pts-0.ubuntu.

    Here I cannot get back to the screen 2101.pts-0.ubuntu; in fact I get exited. Whereas if I do screen -r 9531.pts-0.ubuntu I get:

        [detached from 9531.pts-0.ubuntu]

    so as you can see above, I went inside that session, came out, and I can do it again and again. But with the other sessions the same is not the case. So what mistake am I making?
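
    A likely explanation, hedged: screen -r only resumes detached sessions, which is why 9531 (Detached) works and the two (Attached) ones do not. For a session that is still attached somewhere else:

        screen -d -r 2101.pts-0.ubuntu    # detach it from the other terminal, then attach here
        screen -x 2101.pts-0.ubuntu       # or share it without kicking the other terminal off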

    Read the article

  • Hybrid Graphics on Windows 7/Ubuntu 12.04 Dual Boot

    - by Noob.
    Alright, so here's the situation: I am using an ASUS UL80VT with two graphics cards: integrated Intel graphics and an NVIDIA G210M. I was running an Ubuntu 12.04 - Windows 7 dual boot (on separate partitions). The machine worked perfectly (including the display drivers) without me needing to install anything special or change any settings. However, my hard drive was corrupted and I lost all my data yesterday, so after it was replaced, I installed Ubuntu 12.04 64-bit again after installing Windows 7. I booted up Ubuntu after installation, and noticed it was by default using Unity 2D... Gnome 3.4 wasn't working properly either, so I guessed that the NVIDIA G210M driver wasn't installed/working and the OS was instead using the integrated graphics. I checked the "Additional Drivers" thing, but there were no proprietary drivers listed there, so I went to the NVIDIA website, downloaded the driver directly and installed it. I restarted, but there was no change. After this, I read somewhere that I should change my SATA in the BIOS to "Compatible" rather than "Enhanced". This worked fine and fixed the problem (both Unity and Gnome were working perfectly) but then when I tried booting up Windows 7, I received a BSOD. So I changed it back to Enhanced, and once again, the NVIDIA 210M graphics isn't working on Ubuntu, but on Windows 7 it is. I do not want to keep changing from Enhanced to Compatible every time I reboot to Ubuntu and neither do I want to simply just use one OS. Note that NVIDIA 210M and integrated graphics work perfectly on Windows 7. Also, I don't care about switching between them, I just want to be able to use the NVIDIA one. What can I do so that both Windows 7 and Ubuntu work and NVIDIA G210M works on Ubuntu?

    Read the article
