Search Results

Search found 19525 results on 781 pages for 'say'.


  • How do I restart a Windows XP upgrade?

    - by Jason
    Is there a registry tweak to tell Windows Setup to start over? It tries to continue where it left off after I reboot. I can get to the Recovery Console. I tried to go from SP2 to SP3; the update failed, and I couldn't get into Safe Mode. I put in the SP1 disk (I don't have an SP2 boot disk, just the upgrade package). It ran for a couple of minutes, then gave me the error "the signature for windows xp professional upgrade is invalid", error code 800b0100. I rebooted into Safe Mode, got there, and was told "Windows XP Setup can't run under Safe Mode", press OK to restart. I put the SP3 disk back in, trying to get the "repair" option I never saw with the SP1 disk, but it tried to continue the SP1 install (on the 4th step) and then gave the same signature error as above. I need to get Setup to start over so I can reach the repair option and go back to SP2 (or install SP1 and then add SP2 to it).
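    If it helps as a starting point: Windows keeps its "resume Setup" state in the registry under HKLM\SYSTEM\Setup. The following is only a sketch, assuming you can slave the drive into a working Windows machine (or boot a live environment that has reg.exe); the drive letter D: and the hive name BrokenSystem are placeholders, while the value names are the standard Setup-resume flags.

        REM Load the broken install's SYSTEM hive (assumes its disk is mounted as D:)
        reg load HKLM\BrokenSystem D:\Windows\system32\config\system

        REM Clear the resume-setup state so Setup starts from scratch
        reg add HKLM\BrokenSystem\Setup /v SetupType /t REG_DWORD /d 0 /f
        reg add HKLM\BrokenSystem\Setup /v SystemSetupInProgress /t REG_DWORD /d 0 /f
        reg add HKLM\BrokenSystem\Setup /v CmdLine /t REG_SZ /d "" /f

        reg unload HKLM\BrokenSystem

    After unloading the hive and moving the disk back, booting from the install CD should offer a fresh Setup run (and with it the repair option) rather than resuming the half-finished upgrade.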


  • Apple Mac OS X Mavericks inside VirtualBox

    - by John Sonderson
    I have a few questions regarding Mac OS X and OS virtualization.
    A. Given all the legal restrictions imposed by Apple on Apple products, I would like to know whether it is legal to install the new and freely downloadable Apple Mac OS X Mavericks inside Oracle VirtualBox.
    B. What about older OS X versions such as Mountain Lion?
    C. How many machines can I install it on? What if I don't use the Mac to which the OS is downloaded, but only run it in VirtualBox, and prefer to, say, install Linux on the Mac computer so that it doesn't become unusable under the single-user license policy while I run OS X on Windows 7 within VirtualBox?
    D. I have a PC running Windows 7 but would like to get the OS off a second-hand Mac I'm planning to purchase for this purpose. How do I copy the OS to an ISO so that I can install it on Windows 7? I am unfamiliar with Macs and do not know what software to use for the purpose, nor where the OS is downloaded to (an ISO, a .app bundle, a .gz or .zip file, or whatever). If anyone could provide some guidance with the process I would sincerely appreciate it. Thanks.


  • Recovering a damaged microSDHC

    - by djechelon
    I just bought on eBay a Kingston 32GB microSDHC that was advertised as defective. The seller said there could be formatting problems or problems transferring large files. Unfortunately, when I got it, it was a total mess:
    - My Nikon camera doesn't read it at all (OK, maybe it doesn't support 32GB)
    - My Linux laptop doesn't mount it: can't read superblock
    - The same laptop refuses to mkfs.msdos because it fails while writing the reserved sector
    - The same laptop, under Windows, neither reads nor formats the card
    - My HTC HD2 mounts the card and lets me write to it via USB, but cannot open the files it has just written
    OK, folks, now you'll say I should go through a PayPal complaint... it's not that easy. I consciously bought a half-price card that was known to show some defects, and PayPal complaints take time. Obviously, I can't accept that somebody sold me a completely useless computer decoration, so I'll keep that as a last option. My question is: do you know a way, under either Linux or Windows, to thoroughly scan, test and possibly repair memory cards, even if I have to lose some percentage of space to bad sectors? If I can keep at least half of the card intact it would certainly be fine. I used to do bad-sector marking with hard disks in the past. I almost forgot:
        MONSTR:/home/djechelon # fsck /dev/mmcblk0p1
        fsck from util-linux-ng 2.17.2
        dosfsck 3.0.9, 31 Jan 2010, FAT32, LFN
        Read 512 bytes at 0:Input/output error
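    On the scan-and-salvage question: the stock Linux tools can do a destructive surface scan and then rebuild a FAT filesystem around whatever is bad. A minimal sketch, assuming the card still shows up as /dev/mmcblk0p1 (both commands wipe the card's contents):

        # Destructive read-write surface scan; -o records failing blocks to a file
        badblocks -wsv -o /tmp/sd-badblocks.txt /dev/mmcblk0p1

        # Recreate FAT32, letting mkfs run its own bad-block check (-c)
        mkfs.vfat -F 32 -c /dev/mmcblk0p1

    If badblocks reports errors on essentially every block, the card's controller is likely shot and no amount of marking will help; if failures cluster at one end, repartitioning to exclude that region is another option.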


  • Openfiler crashing without cause or leaving any log messages

    - by user44725
    So my Linux machine keeps crashing, without so much as a bye or leave. I've tried and tried, and failed again, to work out what's happening. Any help would be much appreciated.
        Linux chai 2.6.29.6-0.24.smp.gcc3.4.x86_64 #1 SMP Tue Mar 9 05:06:08 GMT 2010 x86_64 x86_64 x86_64 GNU/Linux (Openfiler)
    Here is what the /var/log/messages file says at the time of the latest crash. Nothing unusual: just greg logging in and out via Samba. You'll notice there is a cron job running for root every minute; ignore this, it isn't the issue either, it was a check I've been running to find the problem.
        Jun 2 10:32:01 chai crond(pam_unix)[16529]: session closed for user root
        Jun 2 10:32:49 chai samba(pam_unix)[15454]: session opened for user greg by (uid=0)
        Jun 2 10:33:01 chai crond(pam_unix)[16537]: session opened for user root by (uid=0)
        Jun 2 10:33:04 chai crond(pam_unix)[16537]: session closed for user root
        Jun 2 10:41:40 chai syslogd 1.4.1: restart.
        Jun 2 10:41:43 chai syslog: syslogd startup succeeded
    That restart was triggered by hand, by pressing the restart button on the box, so basically messages isn't revealing many secrets, and dmesg only shows output from startup. If there is any output I should paste, just say when and where and it'll be done. Thanks for your help! Tim
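    Since the box dies without flushing anything to disk, one standard trick is to stream the kernel log to another machine as it happens with netconsole, so any final oops survives the crash. A sketch under assumed addresses (the IPs, MAC and eth0 are placeholders for your network):

        # On chai: stream printk output over UDP to a log host at 192.168.1.20
        modprobe netconsole netconsole=6665@192.168.1.10/eth0,6666@192.168.1.20/00:11:22:33:44:55

        # On the log host: capture whatever arrives
        nc -u -l 6666 | tee chai-crash.log    # some netcats want: nc -u -l -p 6666

    If the machine hard-locks with nothing arriving even over netconsole, that points more toward hardware (power, RAM, overheating) than toward the kernel.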


  • Add IPv6 support to DirectAdmin server

    - by George Boot
    I just set up a new DirectAdmin server, and I want to prepare it for IPv6 use. My ISP has given me a range of IPv6 addresses I can use; let's say that address is 2a01:7c8:**:1f::. My network adapter uses DHCP to obtain its IP addresses. When I type ifconfig eth0 I get the following result:
        eth0      Link encap:Ethernet  HWaddr 52:**:**:**:ce:f3
                  inet addr:37.**.**.44  Bcast:37.**.**.255  Mask:255.255.255.0
                  inet6 addr: 2a01:7c8:****:1f::/64 Scope:Global
                  inet6 addr: fe80::5054:ff:fe87:cef3/64 Scope:Link
                  UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
                  RX packets:38941 errors:0 dropped:0 overruns:0 frame:0
                  TX packets:29439 errors:0 dropped:0 overruns:0 carrier:0
                  collisions:0 txqueuelen:1000
                  RX bytes:3779534 (3.6 MiB)  TX bytes:5089379 (4.8 MiB)
    As you can see, I have an IPv6 address set, but I can't ping6 an IPv6 host; I get the error "connect: Network is unreachable". I decided that I needed a gateway, so I tried to add one (2a01:7c8:**::1 is the gateway of my ISP):
        ip -6 route add default via 2a01:7c8:****::1 dev eth0
    But it throws an error: "RTNETLINK answers: No route to host". Does somebody know what to do, and how to solve this issue? Thanks a lot!
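    For what it's worth, "RTNETLINK answers: No route to host" when adding a default route usually means the kernel has no on-link route covering the gateway address itself. Two common fixes, sketched with <isp-gateway> standing in for the masked 2a01:7c8:**::1 above:

        # Option 1: first tell the kernel the gateway is directly reachable on eth0
        ip -6 route add <isp-gateway>/128 dev eth0
        ip -6 route add default via <isp-gateway> dev eth0

        # Option 2: route via the router's link-local address, which is always on-link
        # (discover it from the neighbour table or router advertisements)
        ip -6 neigh show dev eth0
        ip -6 route add default via fe80::xxxx:xxxx:xxxx:xxxx dev eth0

    It is also worth checking with the ISP whether the ::/64 address ifconfig shows is really meant as the host address: an interface address ending in :: is the subnet-router anycast address, which is itself suspicious.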


  • Linux DHCPD MAC-address-based groups

    - by GruffTech
    Our current dhcpd.conf looks like the following; about as basic as it gets:
        subnet 10.0.32.0 netmask 255.255.255.0 {
            range 10.0.32.100 10.0.32.254;
            option subnet-mask 255.255.255.0;
            option broadcast-address 10.0.32.255;
            option domain-name-servers 208.67.222.222, 208.67.220.220;
            option routers 10.0.32.5;
            host Dev-ABaird-W {
                hardware ethernet 00:1D:09:3E:49:13;
                fixed-address 10.0.32.94;
            }
            ... more static hosts ...
        }
    The old router is 10.0.32.1. Our company wanted to implement a Squid proxy to better monitor web traffic while at work and, if necessary, block the large time-wasters, e.g. Facebook.com. However, we've quickly realized that this change has played a mean prank on our Polycom SIP phones. Occasionally our phones will not ring: the caller hears ringing (artificially created by our PBX), but the handset never rings. The ONLY thing that has changed in our network is the option routers line. So, since all Polycom MAC addresses begin with 00:04:F2, would it be possible in DHCP to say that any 00:04:F2:* MAC address gets option routers 10.0.32.1, and anything else talks to our gateway?
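    ISC dhcpd can do exactly this with a class that matches the Polycom OUI. A sketch under the assumption that the server is ISC dhcpd (the pool split and ranges are illustrative):

        # "hardware" is one type octet followed by the MAC, so the MAC starts at offset 1
        class "polycom-phones" {
            match if substring(hardware, 1, 3) = 00:04:f2;
        }

        subnet 10.0.32.0 netmask 255.255.255.0 {
            option subnet-mask 255.255.255.0;
            option broadcast-address 10.0.32.255;
            option domain-name-servers 208.67.222.222, 208.67.220.220;
            option routers 10.0.32.5;        # default: the proxy gateway

            pool {
                allow members of "polycom-phones";
                range 10.0.32.100 10.0.32.149;
                option routers 10.0.32.1;    # phones keep the old router
            }
            pool {
                deny members of "polycom-phones";
                range 10.0.32.150 10.0.32.254;
            }
        }

    Static host declarations inherit the subnet-scope options, so they keep using 10.0.32.5 unless overridden per host.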


  • Can I replicate data between MySQL and SQL Server/SQL Azure?

    - by Ernest Mueller
    I have a replicated MySQL setup running happily on Amazon AWS, making user data available locally in various regions. Now I'm faced with an app that needs to go up on Microsoft Azure, and I need to replicate the data over there as well. So that's annoying. I am faced with several options:
    - Replicate from MySQL to SQL Azure/SQL Server, which seems like it would be lovely; is this possible? I'd consider using a third-party tool and paying $$ if I had to. We're not using anything complicated in the db feature set; it's just data in tables.
    - Get MySQL working on Microsoft Azure, which seems really dicey at best. All the HOWTOs I can find say "this is possible, but you really shouldn't try this for production apps."
    - Go non-realtime and do periodic syncs from MySQL to SQL Azure, which may be somewhat expensive and slower.
    - Rip out all my MySQL on Amazon and use SQL Server there, which would make Baby Jesus cry.
    Has anyone gotten MySQL-to-SQL Azure/SQL Server replication or syncing working? Or do you have any other approaches (say, a NoSQL solution that replicates, might meet our but-we-need-to-join-some-tables needs, and can easily be run on both Amazon and Azure)?


  • "Server Unavailable" and removed permissions on .NET sites after Windows Update [closed]

    - by andrewcameron
    Our company has five almost identical Windows 2003 servers with the same host, and all but one performed an automatic Windows Update last night without issue. The one that had problems, of course, was the one which hosts the majority of our sites. What the update appears to have done is make the NETWORK user lose access to the .NET Framework 2.0 files, as the event log was complaining about not being able to open System.Web. This resulted in every .NET site on the server returning "Server Unavailable" as the app domains failed to initialise. I ran aspnet_regiis, which didn't appear to fix the problem, so I ran FileMon, which revealed that nobody but the Administrators group had access to any files in any of the website folders! After resetting the permissions, things appear to be fine. I was wondering if anyone had an idea of what could have caused this to go wrong? As I say, the four other servers updated without a problem. Are there any known issues involved with any of the following updates? My major suspect at the moment is the .NET 3.5 update, as all of the sites on the server are running 3.5.
    - Windows Server 2003 Update Rollup for ActiveX Killbits for Windows Server 2003 (KB960715)
    - Windows Server 2003 Security Update for Internet Explorer 7 for Windows Server 2003 (KB960714)
    - Windows Server 2003 Microsoft .NET Framework 3.5 Family Update (KB959209) x86
    - Windows Server 2003 Security Update for Windows Server 2003 (KB958687)
    Thanks for any light you can shed on this.
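    One note for anyone hitting the same thing: plain aspnet_regiis -i re-registers ASP.NET but does not repair file ACLs; the -ga switch is the one that re-grants an account the ACLs ASP.NET needs. A sketch (the account is the IIS 6 application pool default; the site path is a placeholder):

        REM Re-grant the worker process account the ACLs ASP.NET depends on
        %windir%\Microsoft.NET\Framework\v2.0.50727\aspnet_regiis.exe -ga "NT AUTHORITY\NETWORK SERVICE"

        REM Re-apply read access on the web content folders themselves
        cacls D:\Websites /T /E /G "NT AUTHORITY\NETWORK SERVICE":R

    This doesn't explain why the update stripped the ACLs, but it recovers faster than resetting permissions by hand.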


  • Balancing internal services using a Cisco CSS 11501

    - by Ladadadada
    First, the background to the problem: I have a Cisco CSS 11501 that I am using to load balance a few web servers. These web servers have two network interfaces, one internal and one external, and we send the requests to the internal interface. We have the CSS configured to do NAT because our web servers need to see the client's IP address. Because the TCP packets hit the web servers with a source address on the Internet, the web server tries to send the reply back to the client over the external interface rather than through the load balancer. To stop these replies being sent back out to the Internet via the external interface, we added a routing rule on these boxes so that all traffic with a source address on the Internet uses the load balancer as its gateway. This part works fine. What I would also like to do is use the CSS as a load balancer for internal services such as our MySQL slaves. When I do this, I run into a similar problem: the TCP connection goes from the web server to the load balancer and then from the load balancer to the MySQL slave, but the CSS spoofs the source address of the original web server. The MySQL slave then tries to send the response directly to the web server via the internal network, not via the load balancer. The ideal solution would be to tell the CSS not to do source-address spoofing on the internal network and to do it only for requests originating on the Internet. Is this possible? Failing that, is there a way of directing the load-balanced traffic back through the load balancer while keeping other traffic (say SSH) purely on the internal network? Is there another way of using the CSS 11501 to load balance internal services?


  • LogMeIn does not work at home?

    - by Littlet-ENG
    I've been using LogMeIn successfully in many situations and have had very good results. Our company has a LogMeIn Pro account, and I have used it to share my desktop with customers. At work, I have had no problem with my laptop. At home, one program that I need to share with my customers (SolidWorks) will not display its active screen. I spent 45 minutes on the phone with both the CAD vendor's support and LogMeIn support, with no help. I need help narrowing down what the problem is on my computer:
    - The support guys at SolidWorks got another piece of remote software to work, so it's not the program.
    - I can get LogMeIn to work at the office, so it's not the settings of the LogMeIn Pro account.
    - The LogMeIn people say it's a setting on my computer(?).
    - The internet connection at home is fast enough.
    - I changed graphics settings and that didn't work.
    I can't narrow the problem down any further. Any suggestions?


  • Default Browser hangs (IE, Chrome)

    - by Craig Hinrichs
    IE was my default browser until about three months ago, when I started experiencing this issue: intermittent hangs when opening a new main window or a new tab to a site I knew to be up. What I mean by a hang: the browser would open, say "Waiting for site <name>", and do nothing more. If I closed the window and reopened it, it would connect immediately. Over time I had to keep closing and reopening windows to get to pages. This would happen with any page, including Google. I finally got sick of it and started using Chrome, and I will never go back. I recently upgraded my antivirus (I use AVG), and now I am experiencing the same issue with Chrome. Empirically, it seems that if I don't make Chrome my default browser, I don't experience the issue; I tested this theory for over two hours yesterday. Possible causes I have found but not confirmed:
    - MTU settings are not correct.
    - I am infected but my antivirus has not caught it (unlikely but possible).
    - ??
    I would like to think this is related to my antivirus, but I am unsure how to verify that, and I don't like the idea of killing my antivirus if infection is a possibility. I am looking for tips on how to troubleshoot the possible issues. I am on Windows XP SP3. Thanks in advance.
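    On the MTU theory: a quick way to test it from Windows XP is pinging with the don't-fragment flag and a payload just under the usual Ethernet MTU (1472 data bytes + 28 bytes of IP/ICMP headers = 1500). A sketch:

        REM A full-size frame; failure here with "Packet needs to be fragmented
        REM but DF set" means the path MTU is below 1500
        ping -f -l 1472 www.google.com

        REM Step the payload down until the ping succeeds;
        REM that payload + 28 is the actual path MTU
        ping -f -l 1400 www.google.com

    If the full-size ping goes through cleanly, MTU probably isn't the culprit, and the antivirus's HTTP-scanning layer becomes the stronger suspect.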


  • Recommendations for hosting large videos

    - by Clinton Blackmore
    I recently created and put a 45-minute, 300 MB video file on my website and told a mailing list about it. Checking my site stats, I see that I've used 20% of my "unlimited" bandwidth for the month. As I want to be able to host several videos like this, I clearly need to consider other options. The appeal of hosting the files on my own site (aside from the supposedly unlimited disk space and bandwidth) is being able to control the format, resolution, and quality of the videos, as well as making it clear that I'm the copyright holder (although the videos will be under a Creative Commons license). I find that for the screencasts I'm making, a high resolution (say 3/4 of 1024 x 768) really makes it easier to see what is going on on the screen. It is also always a plus not to have the experience marred by advertisements. One more wrench to throw in: while the videos are non-commercial, they do promote a club, and it seems that that falls afoul of some terms of service (especially for free services; while free is very nice, I will certainly consider putting up some money). What recommendations do you have for fairly long, high-resolution videos? Should I look in depth at sites like YouTube and Vimeo, should I consider a file-sharing site (I have no qualms with someone downloading the entire video first; I wouldn't want to watch 45 minutes in my browser!), hosting the files with BitTorrent (ugh, I think that'd reduce my audience), or should I be looking into other web hosts (and if so, who)?


  • How can I find files added to the system within X minutes of a specific time?

    - by Jack W-H
    I did a fresh install of Mac OS X Mountain Lion today on a new MacBook. Because this was a new install, when I finally got round to configuring some of my own developer things, I was surprised to find that some app had installed a binary into /usr/local/bin: a single binary called galileod. Interestingly, I can't find anything online about galileod, and I had only installed the bare minimum of software at this point. Looking at the file columns in Finder I can see Date Modified was 9th November 2012, but Date Added to the system was today at 17:01. It's now 10:20 PM and I can't remember which software I was installing at that point. So how do I find out which other files were installed to the system within, say, 5 minutes either side of 17:01? EDIT: I found out what galileod was by running galileod --help; it is a binary used with Fitbit to communicate with the USB dongle. So that's the mystery solved, but it would still be interesting to know how to find files added within X minutes of a given time for future reference.
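    For the record, two sketches that should work on Mountain Lion; the timestamps are the ones from the question, and the search roots are assumptions:

        # Spotlight indexes the same "Date Added" attribute Finder shows
        mdfind 'kMDItemDateAdded >= $time.iso(2012-11-09T16:56:00) && kMDItemDateAdded <= $time.iso(2012-11-09T17:06:00)'

        # BSD find can compare file birth time (B) against a date string (t);
        # useful for paths like /usr that Spotlight doesn't index
        find /usr/local -newerBt '2012-11-09 16:56:00' ! -newerBt '2012-11-09 17:06:00'

    The find variant relies on the filesystem recording birth times (HFS+ does) and on the date formats BSD find accepts, so treat it as a starting point rather than gospel.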


  • Can't set screen brightness in Gentoo system

    - by Real Yang
    My system: Linux gentoo 3.10.7-gentoo-r1, with this graphics card:
        VGA compatible controller: NVIDIA Corporation GT216M [GeForce GT 240M] (rev a2)
    Output of xbacklight:
        No outputs have backlight property
    Output of xrandr:
        xrandr: Failed to get size of gamma for output default
        Screen 0: minimum 640 x 480, current 1280 x 720, maximum 1280 x 768
        default connected 1280x720+0+0 0mm x 0mm
           1280x720       0.0*
           1024x768       61.0
           800x600        61.0
           640x480        60.0
           1280x768       0.0
    Output of ls /proc/acpi:
        button/  event
    When I was on kernel 3.8.13, I could change the brightness using xbacklight. I compiled 3.10.7-r1 using genkernel all. Before the upgrade I did get a notice about "compatibility issues for Nvidia users" from emerge, but I still don't know the details. Is there any way to let me set the brightness? I then found the ebuild app-laptop/nvidiabl-0.81 and tried to emerge nvidiabl, and got this message:
        Your kernel does not support FB_BACKLIGHT. To enable it you can enable any
        frame buffer with backlight control or nouveau. Note that you cannot use
        FB_NVIDIA with nvidia's proprietary driver.
        Please check to make sure these options are set correctly. Failure to do so
        may cause unexpected problems. Once you have satisfied these options, please
        try merging this package again.
        ERROR: app-laptop/nvidiabl-0.81::gentoo failed (pretend phase): Incorrect kernel configuration options
        Call stack:
          ebuild.sh, line 93: Called pkg_pretend
          nvidiabl-0.81.ebuild, line 31: Called linux-mod_pkg_setup
          linux-mod.eclass, line 559: Called linux-info_pkg_setup
          linux-info.eclass, line 911: Called check_extra_config
          linux-info.eclass, line 805: Called die
        The specific snippet of code:
          die "Incorrect kernel configuration options"
    [SOLVED] I entered menuconfig again and checked Device Drivers -> Graphics support -> Support for frame buffer devices, and found this:
        <*> nVidia Framebuffer Support
        [*] Support for backlight control (NEW)
    What can I say. Recompiling...


  • How to deal with the extremely big *.ost files in a Terminal Server environment which is running out of space

    - by Wolfgang Kuehne
    Our terminal server is running out of hard disk space, and the files occupying most of it are the *.ost files of Outlook, belonging to the users who work on the terminal server over Remote Desktop all the time. Outlook is installed on the terminal server and various users use it. What would be a solution in this case? Is there a way to limit the size of the *.ost files? I read in forums that running Outlook 2010 in Cached Exchange Mode isn't best practice for an environment where disk space is a major constraint. The first thing that came to my mind was folder redirection, placing the OST files (together with the AppData folder) on a network share, but this does not help, because the OST files are kept in the part of the AppData folder which cannot be redirected. Then I wondered whether it is possible to limit the size of the OST file, or limit how long it keeps mail cached; say, just e-mails from the last 6 months would be sufficient. Another solution that came to my mind is moving the OST files somewhere else. That requires removing the old OST file and creating a new one, and I am not quite sure whether the new OST file will still have the e-mails which were cached in the old one, or will start caching from where the other one left off. What do you suggest?
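    On the size-limiting idea: Outlook does honor registry policies that cap OST/PST size; Microsoft KB 832925 documents the value names. A sketch for Outlook 2010 (the 14.0 key; the 5120 MB cap and 4608 MB warning threshold are arbitrary example figures, expressed in MB):

        Windows Registry Editor Version 5.00

        ; Cap Unicode OST/PST files at 5120 MB, warn at 4608 MB (per KB 832925)
        [HKEY_CURRENT_USER\Software\Policies\Microsoft\Office\14.0\Outlook\PST]
        "MaxLargeFileSize"=dword:00001400
        "WarnLargeFileSize"=dword:00001200

    Note this caps growth but doesn't shrink existing files; a new OST (or a compaction) is still needed to reclaim space. Also, an OST is only a cache of the server mailbox, so a freshly created one simply re-syncs from Exchange and nothing is lost by recreating it.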


  • Restore a single user's Exchange 2003 mailbox from backup

    - by Campo
    I take weekly full backups of Exchange, and also complete weekly backups of the entire server. It is a Server 2003 R2 machine with AD and Exchange 2003 all on one box. One user's inbox has disappeared; she now has 19,000+ junk items. It is possible the inbox got mixed into the junk, but regardless, it is such a huge mess that she is not going to go through all of it... I want to restore her mailbox from the backup. I followed this MS KB: http://support.microsoft.com/kb/823176 and had to use Method 3. I have a VM of Server 2003 R2 with Exchange, but the restore from NTBackup is failing. The backup log just states to check the application log... and the application log points back to the backup log... The only information is "failed to restore". The only thing different is the computer name. The only error I can find is in the application log: Information Store: Database not found. All the others just say that the backup failed. Any assistance is greatly appreciated.


  • How can I debug solutions in Visual Studio 2010 from a network share?

    - by alastairs
    I've recently got a new Mac laptop and am running VS2010 in a Parallels virtual machine. It's mostly working out well for me, but I'm having some problems with debugging specific project types, related to the fact that the projects are being accessed via a network share. Test projects don't run because the test runner can't load the tests' DLL. Web projects fail to run in the Visual Studio mini web server, throwing the following exception: 'An error occurred loading a configuration file: Failed to start monitoring changes to path\to\web.config'. I've spent the evening trawling the web with little luck on this. After reading these two posts, I tried out the usual CasPol changes, but then found this post from one of the early VS2010 betas indicating that CasPol is no longer needed/supported in .NET 4.0 and VS2010. The network share is accessible via both a mapped drive and the UNC path. The virtual machine runs its applications under the administrator account, which appears to have all the necessary permissions on the network share to create, read, write and delete files and folders. I say "appears to have" as I can't view the Security Properties of the appropriate folder via Explorer: the Security tab just isn't present. Has anyone managed to successfully load and debug web and test projects from a network share in VS2010?
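    One detail worth noting on the CasPol dead end: in .NET 4 the old CAS policy machinery was retired, and the supported switch for trusting assemblies loaded from a share is loadFromRemoteSources. A sketch of the element added to devenv.exe.config (the path below is the default VS2010 install location; the test runner's .config would need the same):

        <!-- e.g. C:\Program Files\Microsoft Visual Studio 10.0\Common7\IDE\devenv.exe.config -->
        <configuration>
          <runtime>
            <!-- Grant full trust to assemblies loaded from network locations -->
            <loadFromRemoteSources enabled="true"/>
          </runtime>
        </configuration>

    This addresses the assembly-trust half of the problem; the "Failed to start monitoring changes to web.config" error concerns file-change notifications on UNC paths and may persist independently.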


  • How (in)secure are cell phones in reality?

    - by Aron Rotteveel
    I was recently re-reading an old Wired article about the Kaminsky DNS vulnerability and the story behind it. In this article there was a quote that came across as a little exaggerated to me: "The first thing I want to say to you," Vixie told Kaminsky, trying to contain the flood of feeling, "is never, ever repeat what you just told me over a cell phone." Vixie knew how easy it was to eavesdrop on a cell signal, and he had heard enough to know that he was facing a problem of global significance. If the information were intercepted by the wrong people, the wired world could be held ransom. Hackers could wreak havoc. Billions of dollars were at stake, and Vixie wasn't going to take any risks. When reading this I could not help but feel it was a bit blown up and theatrical. Now, I know absolutely nothing about cell phones and the security problems involved, but to my understanding, cell phone security has improved considerably over the past few years. So my question is: how insecure are cell phones in reality? Are there any good articles that dig a bit deeper into this matter?


  • Is there any way to retire an AT&T Yahoo email account?

    - by KindaSortaAsking
    Here are the facts as I know them (pretty sure). Yahoo handles AT&T's DSL email accounts. I've called AT&T tech support and customer service and they say they can't help, but there has GOT to be a way to do this; it's too simple a thing not to be able to do. I got behind on my DSL bill and my account got suspended. When I paid my bill, they said my account had been deactivated and I had to get a new account. When I tried to register the new account with my old email address, it would not let me, saying the address was in use, so I used a new one. The old email address is tied to a DSL account that can NEVER be reactivated. There has got to be a way to retire the old email address so that I can re-create it as a subaccount on my new DSL account. I'm not interested in anything that was in the old account (emails, addresses, etc); I just want the address back.


  • How do you use a Mac without a mouse?

    - by NitroxDM
    So I was using my Mac Mini and decided to set the application button on my Logic Click mouse to open Exposé; I had this set up at one time and found it useful. So I opened the Exposé control panel and took a look at the triggers for Exposé. There is a drop-down for a secondary trigger, and in that drop-down there was a ton of mouse buttons: 100+. So I selected #32 just to see which button #32 was on my 5-button mouse. Turns out it's all of them! No matter what button I press, guess what happens... I get Exposé. So, using just the keyboard, how do I:
    1. Change the Exposé trigger? I can use the keyboard to bring up the Exposé control panel, but I can't get anything else to come into focus.
    2. Kill Exposé? Right now all I can say is D'oh! Any ideas?
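    A couple of keyboard-only escape hatches, assuming a stock OS X setup (the shortcut defaults vary slightly by version):

        # Exposé is owned by the Dock process; restarting it clears a stuck trigger.
        # Open Terminal via Spotlight first (Cmd+Space, type Terminal, Return), then:
        killall Dock

    For navigating dialogs without the mouse: Ctrl+F2 moves focus to the menu bar, and turning on Full Keyboard Access (Ctrl+F7, or the Keyboard pane of System Preferences) lets Tab reach every control in a window, which should make the Exposé preference pane fully reachable.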


  • Specific DNS name sometimes incorrectly resolves to the wildcard

    - by Mojo
    I have an intermittent problem, and I'm not sure where to start trying to troubleshoot it. In our dev environment, we have two visible IP addresses on load balancers, one for the front-end and one for a number of back-end service machines. The front-end is configured to take a wildcard DNS name to support generic "portals":
        dev.example.com      A      10.1.1.1
        *.dev.example.com    CNAME  dev.example.com
    The back-end servers are all specific names within the same space:
        core.dev.example.com    A      10.1.1.2
        cms.dev.example.com     CNAME  core.dev.example.com
        search.dev.example.com  CNAME  core.dev.example.com
    Here's the problem: periodically a developer or a program trying to reach, say, cms.dev.example.com will get a result that points to the front-end instead of the back-end load balancer:
        cms.dev.example.com is an alias to core.dev.example.com
        core.dev.example.com is an alias to dev.example.com (WRONG!)
        dev.example.com 10.1.1.1
    The developers are all on Mac OS X machines, though I've seen the problem occur on an Ubuntu machine as well, using a local cloud-host DNS resolver. Sometimes the developer is using a VPN, which directs DNS to its own resolver, and sometimes he's on the local net using the DNS resolver assigned by the NAT router. Sometimes clearing the Mac OS X DNS cache, logging into the VPN, then logging out of the VPN makes the problem go away. The authoritative origin server is at Zerigo, and a dig directly against their name servers always seems to give the correct answer. The published cache time for these records is 15 minutes, but the problem has been intermittent for about a week. Any troubleshooting suggestions?
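    A sketch of how to corner the bad cache the next time it happens (the resolver IP and the authoritative server name are placeholders for your actual ones):

        # Ask the authoritative server directly; this bypasses every cache
        dig @<zerigo-ns> cms.dev.example.com

        # Ask the resolver the failing machine is actually configured to use
        dig @10.0.0.1 cms.dev.example.com

        # Walk the delegation from the roots to see where the answers diverge
        dig cms.dev.example.com +trace

        # Flush the local cache on the Mac before re-testing (10.7/10.8)
        sudo killall -HUP mDNSResponder

    If the authoritative answer is always right, whichever resolver hands back the wildcard answer is the one synthesizing or mis-caching it; given that the problem follows the VPN/NAT resolver switch-over, a stale cache entry on one of those two resolvers is the prime suspect.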


  • What is the quickest reliable way to backup a NAS drive to a USB drive?

    - by Tim Murphy
    How would you back up 600+ GB of data on a NAS (network-attached storage) drive to an external USB drive? The NAS drive does not contain mission-critical data; nonetheless I wish to make weekly copies of it just in case. The NAS drive is almost exclusively used as an archive dump and is rarely updated. However, the backup strategy used must have a simple restore procedure, so I can confidently say the data now on the NAS drive is exactly how it was at the time of backup. I did try xcopy, but it seemed like it would take many, many hours and it eventually crashed with insufficient memory. http://www.ctunion.com/node/114 suggests I would need to use xxcopy instead, due to folder/file name lengths. My concern with xcopy/xxcopy is the length of time they take; I'm hoping something else is faster. The NAS drive is a D-Link DNS-313 with a 1 TB drive installed, connected to the router via Ethernet cable. The USB drive is a 1 TB Seagate and can be connected to Windows Vista (preferred) or Windows 7 PCs. Both PCs are usually connected wirelessly; however, an Ethernet cable can be used during the backup to speed up the process.
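    One candidate worth testing: robocopy ships with both Vista and Windows 7, copes with long paths better than xcopy, and mirrors incrementally, so after the first run a weekly pass only copies what changed. A sketch with the share name and drive letter as placeholders:

        REM Mirror the NAS share to the USB drive (copy changes, delete extras)
        robocopy \\<nas-name>\<share> E:\nas-backup /MIR /R:1 /W:5 /LOG:E:\nas-backup.log

        REM Restore is the same command with source and destination swapped
        robocopy E:\nas-backup \\<nas-name>\<share> /MIR /R:1 /W:5

    /MIR makes the destination an exact replica, which satisfies the "exactly how it was at backup time" requirement, and the log file gives a checkable record of each run.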


  • Epson Artisan 800 on Ubuntu/Linux

    - by Tim Lytle
    Update for Ubuntu 10.04: printing should work out of the box; scanning still needs the newer SANE backend. I'm looking for a known-good way to set up an Epson Artisan 800 on Ubuntu specifically, or on any Linux box in general. It is a printer/scanner with Ethernet/Wi-Fi/USB. I'd like to use it as a network printer/scanner, able to do both from my Windows and Ubuntu machines; however, if it needs to be physically connected to a computer (preferably the Ubuntu machine), that is doable (again, then sharing print/scan functions to the network). Basically, I'm looking for someone who has used this printer/scanner (or a similar one) in a multi-platform environment to share how they set it up and how well it worked. Update: a little more information. Like most printers (I expect), the documentation basically says "don't use plug-and-play, run our setup CD from your Windows/Mac system" to do anything (even to set it up for network use). I guess that's to make it easy for anyone else to set up, but when you're looking to use it with an OS unsupported by Epson's documentation, you're just stuck on your own. What I was hoping for was someone who could say: "Forget the bundled software, do [this] to set it up on Wi-Fi manually, install [this] to connect to the scanner from [OS], printing works with [this] driver; at least that's how I set it up." I will (and have so far) use the information here, and I'll post my own setup when I'm done if there's no one else out there with that experience.


  • Improving browser performance while using lots of tabs?

    - by Andrew
    My browsing habits cause me to open lots of windows and tabs, either related to different projects I'm working on or to things I may want to read later. I use OS X with about 5 Spaces and multiple windows in each Space. The problem is that eventually I'll have around 200 or more tabs open (spread over 15-20 windows) that I don't want to close. Needless to say, my computer's performance starts to degrade. As I write this on my mobile, Safari on my laptop is locking up the computer. I used to use Chrome but found better performance with Safari. What I'd like to know: is there any comparison of browser performance as a function of tab usage? I don't need a browser that keeps all tabs active. It would be great if the browser could improve performance by "putting tabs to sleep", or if there were some sort of tool for saving a "workspace" of tabs that you could reactivate the next time you work on that project. What sort of solution can you recommend for this problem?


  • Win 7 dual monitor: don't move application windows when the second monitor is turned off

    - by codewaggle
    The title is correct; it should say "Don't", not "Doesn't": I do not want the application windows to be moved to the main monitor when I turn off the second monitor. On my Windows XP dual-monitor system, I can turn off the monitors, and when I turn them back on the application windows are in the same locations on the same monitors as when I turned the monitors off. On the Windows 7 system, every time I turn off the monitors (or just the second monitor), all of the application windows get moved to the "main" monitor. After experimenting with the settings, I've found one procedure that lets me turn off the monitors and still keep my application windows laid out in my chosen locations on the two monitors:
    1) Switch the display setting to a single monitor.
    2) Turn off the monitors.
    3) Turn on the monitors.
    4) Switch the display setting back to "Extend these displays".
    After step 4, the application windows that I had laid out on the second monitor move back to their original locations there. Is there a Windows or nVidia setting that would leave the application windows on the second monitor, so that I don't need to switch the display settings every time I turn off the monitors? Specs:
    - Windows 7 64-bit, dual monitors (1 DP, 1 DVI)
    - Desktop (most questions like this seem to be about laptops)
    - nVidia Quadro 2000
    - nVidia Control Panel
    - nVidia nView Control Panel

