Search Results

Search found 23762 results on 951 pages for 'network speed'.

Page 513 of 951

  • Can I backup my IMAP Gmail account locally using only Alpine?

    - by BasicObject
    I recently discovered CLI email clients and have fallen in love with their speed and simplicity. After playing around with mutt and alpine, I decided I favor alpine. I am a Gmail IMAP user and have many years of emails that I'd like to store locally. Is there a more or less convenient way to retain IMAP functionality and back up, on a weekly basis, only the emails that haven't been backed up already? I have alpine set up with my Gmail account over IMAP and it's working great. I'm just wondering if there is a way to make an offline backup or "archive" locally on my computer while retaining the multi-device access that IMAP offers. I apologize if this has been asked before; I did search for it and did not find my answer. Thank you for reading.
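
    A minimal sketch of one way to handle the incremental-archive part outside Alpine, assuming Python's standard imaplib and mailbox modules. The folder name, credentials, and file paths are placeholders, and Gmail will typically require an app password or OAuth rather than the account password; the mailbox is opened read-only so the server-side IMAP state stays untouched for other devices.

        import imaplib, mailbox, os

        FOLDER  = '"[Gmail]/All Mail"'                      # hypothetical folder to archive
        ARCHIVE = os.path.expanduser("~/mail/gmail-archive.mbox")
        STATE   = ARCHIVE + ".lastuid"                      # highest UID already saved

        def last_uid():
            return int(open(STATE).read()) if os.path.exists(STATE) else 0

        imap = imaplib.IMAP4_SSL("imap.gmail.com")
        imap.login("you@gmail.com", "app-password")         # placeholders
        imap.select(FOLDER, readonly=True)                  # read-only: leaves IMAP flags alone

        start = last_uid() + 1
        typ, data = imap.uid("SEARCH", "UID %d:*" % start)  # only messages not yet archived
        mbox = mailbox.mbox(ARCHIVE)
        for uid in data[0].split():
            if int(uid) < start:                            # servers return the last message even when nothing is new
                continue
            typ, msg = imap.uid("FETCH", uid, "(RFC822)")
            mbox.add(msg[0][1])                             # append the raw message to the local mbox
            open(STATE, "w").write(uid.decode())
        mbox.close()
        imap.logout()

    Run weekly from cron: each run fetches only UIDs above the last one recorded, so the local mbox grows incrementally while the mail itself stays on the server.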

    Read the article

  • Graphic Setup tune-up checklist

    - by Click Ok
    I was trying to play the game Warzone 2100. The game runs fine, with nice speed, but the screen shows flickering horizontal lines... My PC has an integrated GeForce Go 6100 GPU. OK, not a powerful card, but it shouldn't be the end of the world to run a "simple" game like this (compared with other games that ask you to spend a fortune on an expensive card). So I think the problem may be the configuration of my machine. I use it primarily for programming, so I pay little attention to the video setup. I would like a checklist to know whether my PC is "ready" for games. For example, I know that I need: the latest video drivers, and updated DirectX and OpenGL. What do you suggest? Are there also some good programs that test performance and suggest improvements to the system? Thank you! PS: I'm using Windows 7

    Read the article

  • Ideal memory configuration for a 4-bank, DDR3, AM3+ FX board - 1 vs 2 vs 4 DIMMs?

    - by TardisGuy
    OK, so I've been looking around, trying to learn and understand how RAM works. One answer I got said: "The addressing is best for 2 sticks, and when you use 4, it slows down." Another answer said something like: there's a bank/channel interleave that makes the memory read like one stick. I also read something about memory density being a factor. I dug further and found out that there's a higher supported memory speed on my board for 2 sticks than for 4, so now I'm trying to build a mental picture of how and why, and... pfft. Can anyone explain, or recommend a resource that would answer these questions?

    Read the article

  • TCP 30 small packets per second flood connection with server

    - by Denis Ermolin
    I'm testing the connection between a Flash client and a cloud server (boost::asio on the server side) over TCP. My connection to the server is already really poor: 120 ms ping on average. I found that when I start sending packets with a 2-byte payload (not counting the TCP header) at 30 packets/s, the ping grows to 170-200 ms on average. I think that is really bad, and that my poor connection and cloud provider are the reason for this high ping even without any load. What do you think? (I tested my software; it can process about 50k small packets/s, so the software is not the problem.) I measure the ping through the Flash client: it sends a packet with a timestamp, and the server immediately sends it back to the client.
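
    A minimal sketch of that echo-based latency probe in Python, assuming the server simply echoes each payload back as described; the host, port, and payload format are placeholders. The TCP_NODELAY line is there because Nagle's algorithm batching many tiny writes is a common source of extra latency at exactly this packet size and rate.

        import socket, struct, time

        HOST, PORT = "server.example.com", 9000                  # placeholders

        s = socket.create_connection((HOST, PORT))
        s.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)  # send small packets immediately

        for _ in range(30):
            sent = time.time()
            s.sendall(struct.pack("!d", sent))                   # 8-byte timestamp payload
            s.recv(64)                                           # wait for the server's echo
            print("rtt %.1f ms" % ((time.time() - sent) * 1000))
            time.sleep(1.0 / 30)                                 # roughly 30 packets per second
        s.close()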

    Read the article

  • Request Multiple Maya Floating Server Licenses for extra Satellite clients

    - by Rob
    Hello all: I am currently setting up a "render farm" for Maya 2008 Unlimited. One Maya workstation license comes with the ability to render on eight satellite nodes, and it works perfectly; the remote rendering works like a charm. However, we have additional boxes to set up as satellite rendering nodes, and we have extra Maya workstation licenses. Ideally, the workstation could take two licenses and thus render on 16 nodes, but I haven't been able to figure out how, or determine whether it is actually possible. It's a big project, where rendering the entire thing takes weeks, so the speed-up would be worth it. Any thoughts?

    Read the article

  • Kill program after it outputs a given line, from a shell script

    - by Paul
    Background: I am writing a test script for a piece of computational biology software. The software I am testing can take days or even weeks to run, so it has recovery functionality built in for system crashes or power failures. I am trying to figure out how to test the recovery system. Specifically, I can't figure out a way to "crash" the program in a controlled manner. I was thinking of somehow timing a SIGKILL to fire after some amount of time. This is probably not ideal, as the test case isn't guaranteed to run at the same speed every time (it runs in a shared environment), so comparing the logs to the desired output would be difficult. The software DOES print a line for each section of analysis it completes. Question: is there a good/elegant way (in a shell script) to capture output from a program and then kill the program once a given line, or a given number of lines, has been output?
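
    A minimal sketch of the idea, written in Python rather than pure shell (the same pattern works in shell by piping the program into a while-read loop): watch the program's output line by line and kill it the moment a marker line appears. The command and marker string are hypothetical placeholders.

        import subprocess, signal

        CMD    = ["./analysis_tool", "--input", "test.dat"]   # hypothetical command under test
        MARKER = "Completed section 3"                        # hypothetical progress line to crash after

        # Note: the child may block-buffer its own stdout when piped;
        # running it under stdbuf -oL (or its own unbuffered flag) helps.
        proc = subprocess.Popen(CMD, stdout=subprocess.PIPE, text=True, bufsize=1)
        for line in proc.stdout:                              # delivered as each line is printed
            print(line, end="")                               # keep a copy of the log
            if MARKER in line:
                proc.send_signal(signal.SIGKILL)              # simulate an abrupt crash at a known point
                break
        proc.wait()

    Because the kill is tied to a progress line instead of a timer, the "crash" lands at the same logical point in the analysis on every run, which keeps the recovered output comparable across runs.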

    Read the article

  • ADO.NET: Can't connect to mdf database file

    - by Nabo
    I'm writing an application that uses a SQL Server 2005 database. In the connection string I'm specifying the mdf file like this:

        connstr = @"Data Source=.\SQLEXPRESS; AttachDbFilename=" + fileLocation + "; Integrated Security=True; User Instance=True";

    When I execute this code:

        public static void forceConnection()
        {
            try
            {
                conn = new SqlConnection(connstr);
                conn.Open();
            }
            catch (Exception e)
            {
                MessageBox.Show(e.Message, "Erro", MessageBoxButtons.OK, MessageBoxIcon.Error);
            }
            finally
            {
                if (conn != null) conn.Close();
            }
        }

    I receive an exception: "A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified)". This code works on XP but not on Vista. I tried running Visual Studio as administrator and moved the mdf file to the user data folders, but the error persists. Any help? Thanks!

    Read the article

  • Forwarding HTTP Request with Direct Server Return

    - by Daniel Crabtree
    I have servers spread across several data centers, each storing different files. I want users to be able to access the files on all servers through a single domain and have the individual servers return the files directly to the users. The following shows a simple example:

    1) The user's browser requests http://www.example.com/files/file1.zip
    2) The request goes to server A, based on the DNS A record for example.com.
    3) Server A analyzes the request and works out that /files/file1.zip is stored on server B.
    4) Server A forwards the request to server B.
    5) Server B returns file1.zip directly to the user, without going through server A.

    Note: steps 4 and 5 must be transparent to the user and cannot involve sending a redirect to the user, as that would violate the requirement of a single domain. From my research, what I want to achieve is called "Direct Server Return" and it is a common setup for load balancing. It is also sometimes called a half reverse proxy.

    For step 4, it sounds like I need to do MAC address translation and then pass the request back onto the network; for servers outside server A's network, tunneling will be required. For step 5, I simply need to configure server B as per the real servers in a load-balancing setup: server B should have server A's IP address on the loopback interface, and it should not answer any ARP requests for that IP address.

    My problem is how to actually achieve step 4. I have found plenty of hardware and software that can do this for simple load balancing at layer 4, but these solutions fall short and cannot handle the kind of custom routing I require. It seems like I will need to roll my own solution. Ideally, I would like to do the routing / forwarding at the web server level, i.e. in PHP or C# / ASP.NET. However, I am open to doing it at a lower level such as Apache or IIS, or at an even lower level, i.e. a custom proxy service in front of everything.

    Read the article

  • Need a VBScript to check if a service exists

    - by Shorabh Upadhyay
    I want to write a VBScript that checks locally whether a specific service is installed/exists. If it does not exist, the script should display a message (any text) and disable the network interface (NIC). If the service exists and is running: no action, just exit. If the service exists but is not running: same action, the script should display a message (any text) and disable the NIC. I have the code below, which displays a message when the service is stopped, but it is not checking whether the service exists, and it is not disabling the NIC.

        strComputer = "."
        Set objWMIService = GetObject("winmgmts:" _
            & "{impersonationLevel=impersonate}!\\" & strComputer & "\root\cimv2")
        Set colRunningServices = objWMIService.ExecQuery _
            ("Select State from Win32_Service Where Name = 'dhcp'")
        For Each objService in colRunningServices
            If objService.State <> "Running" Then
                errReturn = MsgBox("Stopped")
            End If
        Next

    Please help. Thanks in advance.

    Read the article

  • How can I disable HTML5 content in popular browsers like Firefox and Chrome?

    - by HRJ
    The bad thing about Flash video was that it required a third-party plugin to play the content. The good thing was that I could select which content to play, using the click-to-play feature in Firefox and Chrome. But now that HTML5 video is getting popular, I see a lot of ads popping into view again. They are not only a distraction; they hog resources on my computer and make the fans spin at full speed. Is there a way to disable HTML5 audio/video content by default, and enable it only selectively?

    Read the article

  • How to get the spec of a machine on Linux?

    - by machinePurchaser
    I am interested in getting the spec of a machine, because I am thinking of getting a similar server. What I mostly want to know is the number of cores / CPUs / etc., the amount of memory, the speed of the CPUs, the CPU cache size, and any other detail that is important for performance. My question is two-fold: Which parameters should I be interested in, other than the ones I listed above? Is there an easy way to read them off the machine on Linux? cat /proc/cpuinfo reveals a lot about the CPUs, for example... What about memory (I would rather not rely on top), etc.?
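
    A small sketch of pulling the basics straight from /proc with Python; the field names below ("model name", "cpu MHz", "cache size", "MemTotal") are the common x86 ones and can differ on other architectures. lscpu, free -h, and dmidecode (for per-DIMM details) report the same information in more depth.

        # Summarize CPU and memory details from /proc.
        def proc_fields(path, sep=":"):
            fields = {}
            with open(path) as f:
                for line in f:
                    if sep in line:
                        key, value = line.split(sep, 1)
                        fields.setdefault(key.strip(), value.strip())  # keep first occurrence
            return fields

        cpu = proc_fields("/proc/cpuinfo")
        mem = proc_fields("/proc/meminfo")
        logical = sum(1 for line in open("/proc/cpuinfo") if line.startswith("processor"))

        print("logical CPUs :", logical)
        print("model        :", cpu.get("model name"))
        print("clock (MHz)  :", cpu.get("cpu MHz"))
        print("cache size   :", cpu.get("cache size"))
        print("total memory :", mem.get("MemTotal"))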

    Read the article

  • MySQL gzipped Export in PhpMyAdmin has wrong size in Mozilla

    - by Michal Gow
    This is really strange. I am using phpMyAdmin 2.11.9.6 on Linux hosting. When I export databases using "gzipped" compression in Firefox, I get files that are the size of the uncompressed database, but they seem to download at an incredible speed (10 times faster than my ISP allows). So, in the end, for a database of 10 MB: I get a 10 MB gzip downloaded almost instantly; it indeed shows 10 MB on the drive; and it is corrupted. Zip compression works just fine (I get a file of about 1 MB with the correct compressed database content). And the weirdest thing: this happens only with Mozilla Firefox (13.0.1); Internet Explorer 9 downloads correct gzipped files... Any hint?

    Read the article

  • Installing Windows on multiple computers

    - by Rob
    At work we've decided to buy SSDs to speed up our computers. We'd also like to upgrade from Windows 7 to Windows 8.1 (read: clean install). I'm talking about 10 computers that are going to get a fully clean installation. Is there any trick I can use to avoid installing Windows 10 times? Most of the computers differ in hardware, so I think straight disk duplication is not a good option. What can I do? I'd like to spend as little time as possible, because all the computers will also need SQL Management Studio and Visual Studio 2012/2013. Thanks!

    Read the article

  • using one disk as cache for others

    - by HugoRune
    Hi. Given a PC with several hard drives: is it possible to use one fast disk as a giant file cache? That is, automatically copying frequently accessed data to that one disk, and transparently redirecting reads and writes to it, so that the other drives would only have to be accessed occasionally (writes would have to be forwarded to the other disks after a while, of course). Advantages: the other drives could be powered down most of the time, reducing power, heat, and noise; the speed of the other drives would not matter much; and the cache disk could be a solid-state drive. How can I set such a system up? Which OS supports these options? Is this possible at all using Windows or Linux?

    Read the article

  • How to empty the Directory-Name-Lookup-Cache?

    - by mutzel
    I'm having speed problems with a Windows application that looks up files in several directories on an NTFS-formatted HDD. To find out why, and under which settings, the problem occurs, I set up a test scenario on my local PC, but it did not work as well as I expected: after the program's first scan of all the directories, any additional scans of the same directories run much faster than the first one. I assume that the directory names and the file names they contain were cached by Windows. Is it possible to either disable or empty this cache, if it even exists?

    Read the article

  • New Windows 7 Install Crashing

    - by bobber205
    One big reboot crash and one smaller crash already, 15 minutes in. I did a basic install of Windows 7 and installed Chrome and Firefox. I had just finished loading up my Gmail account in Chrome/Firefox to show the speed difference, and we thought it would be hilarious to see how slow IE8 was. :P Just as IE8 finished opening, the computer's screen went black. After a restart and a couple of minutes, Explorer crashed as well. What is going on? This install is only 15-20 minutes old. :P

    Read the article

  • How can I figure out which PHP extensions aren't being used?

    - by Tom Marthenal
    I manage a server (running Ubuntu) that hosts our clients' sites: a few dozen different PHP-based websites, mostly small, but also some installations of CMSes and forums. I used the get_loaded_extensions() function to see what extensions I have loaded. To help streamline the server (removing unnecessary extensions makes upgrading easier and marginally improves speed), I'd like to remove the extensions that aren't being used by any of the sites. I currently have 54 different extensions loaded. I can easily eliminate some of them from the candidate list because I know they are used, but others I am less sure about. Is there some way I can see which extensions have not been used recently?
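
    One rough first pass, sketched in Python under a couple of assumptions: it asks PHP which functions each loaded extension provides, then greps the site code for calls to them. It only catches direct function calls (not classes/OO APIs such as PDO, or dynamic calls), so treat "no callers found" as a candidate to investigate rather than a verdict; the docroot path is a placeholder.

        import collections, os, subprocess

        DOCROOT = "/var/www"                       # hypothetical path to the hosted sites

        # Ask PHP for an "extension function-name" line per provided function.
        php_code = ('foreach (get_loaded_extensions() as $e) '
                    'foreach ((array) get_extension_funcs($e) as $f) '
                    'echo "$e $f\\n";')
        listing = subprocess.run(["php", "-r", php_code],
                                 capture_output=True, text=True).stdout

        funcs = collections.defaultdict(set)       # extension -> its function names
        for line in listing.splitlines():
            ext, fn = line.split()
            funcs[ext].add(fn.lower())

        hits = collections.Counter()               # extension -> files that appear to call it
        for root, _, files in os.walk(DOCROOT):
            for name in files:
                if name.endswith(".php"):
                    code = open(os.path.join(root, name), errors="ignore").read().lower()
                    for ext, names in funcs.items():
                        if any(fn + "(" in code for fn in names):
                            hits[ext] += 1

        for ext in sorted(funcs):
            if hits[ext] == 0:
                print("no obvious callers found for:", ext)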

    Read the article

  • VM: Windows 7 host, Linux guest, VT-d?

    - by chx
    I am sick of the driver issues on Linux, so I am planning to switch to Windows 7 as the host and run my Linux system in a virtual machine on it. My laptop has integrated Intel graphics and supports VT-d. For speed reasons I would like to assign that graphics card to the Linux guest. Parallels could do it, but this page says: "Note: If you have only one PCIe video adapter, its name will be grayed out in the PCI Devices list and you will not be able to allocate it to your virtual machines." I would be perfectly fine with a headless Windows 7 (I can administer it remotely from other computers or from the Linux guest) -- is there any VM software that doesn't have this restriction?

    Read the article

  • Creating a separate static content site for IIS7 and MVC

    - by JK01
    With reference to the Server Fault blog post "A Few Speed Improvements", which talks about how static content for Stack Exchange is served from a separate cookieless domain... how would someone go about doing this on IIS 7.5 for an ASP.NET MVC site? The plan so far:

    - Register a domain, e.g. static.com, and create a new website in IIS
    - Manually copy the js / css / images folders from MVC as-is, so that they have the same paths on the new server
    - Enable IIS gzip settings (js/css = high compression, images = none)
    - Set caching with far-future expiry dates: <clientCache cacheControlCustom="public" /> in the web.config
    - Never set any cookies on the static.com site
    - Combine and minify js / css
    - Auto-deploy changes to static content with WebDeploy

    Is this plan correct? And how can you use WebDeploy to deploy the whole web app to one server and then only the static items to another? I can see there is a similar question, but it is for Apache (Creating a cookie-free domain to serve static content), so it doesn't apply.
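
    A minimal web.config sketch for the static site covering the caching and compression items above; the one-year max-age and the compression flag are illustrative values, not taken from the question.

        <configuration>
          <system.webServer>
            <staticContent>
              <!-- far-future expiry plus the "public" directive for everything this site serves -->
              <clientCache cacheControlMode="UseMaxAge"
                           cacheControlMaxAge="365.00:00:00"
                           cacheControlCustom="public" />
            </staticContent>
            <!-- gzip static responses when the client accepts it -->
            <urlCompression doStaticCompression="true" />
          </system.webServer>
        </configuration>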

    Read the article

  • Super slow Time Machine backup on my Mac

    - by lowellk
    I just got a new 2 TB drive that I'm trying to use as a Time Machine drive for my Mac, which has a 1 TB drive. On my first attempt to back it up, I'm getting terrible throughput, not even 1 GB per day (it's been running for 36 hours now). I erased the disk and tried to copy a large file to it, and got the same slow speed. What can I do to diagnose this? Are there any tools that can inspect the disk and tell me if it's messed up? Thanks!

    Read the article

  • Client-Side caching on IIS7 doesn't seem to work

    - by thomasbtv
    I have set up content caching for a specific folder by following the local web.config method. I don't think it works, and I would like to fix this. I activate the cache using the IIS / HTTP Headers / Common Headers feature, set to 1 day of expiration. I opened a page with Google Chrome in a private window and then opened the Network tab in the console. The first time I load the page, everything loads from the site, obviously. If I refresh the page, I see two types of loading in the Network console: the files from Google, Facebook, and such have a status of 200 and a size of "(from cache)"; the files from the folder for which I set the caching have a status of 304 and their size is displayed. So, I guess the caching setting doesn't work? Or does the 304 response mean that the file is loaded from the cache? If it doesn't work, how can I make it work? Thanks!

    Read the article

  • Reduce durability in MySQL for performance

    - by Paul Prescod
    My site occasionally has fairly predictable bursts of traffic that push throughput to 100 times the normal level. For example, we are going to be featured on a television show, and I expect that in the hour after the show I'll get more than 100 times my normal traffic. My understanding is that MySQL (InnoDB) generally keeps my data in a bunch of different places: RAM buffers, the commit log, the binary log, the actual tables, and all of the above again on my DB slave. This is too much "durability" given that I'm on an EC2 node and most of this goes across the same network pipe (the file systems are network-attached). Plus the drives are just slow. The data is not high value, and I'd rather take a small chance of losing a few minutes of data than a high probability of an outage when the crowd arrives. During these traffic bursts I would like to do all of that I/O only if I can afford it. I'd like to keep as much in RAM as possible (I have a fair chunk of RAM compared to the amount of data that would be touched in an hour). If buffers get scarce, or the I/O channel is not too overloaded, then sure, I'd like things to go to the commit log or the binary log to be sent to the slave. If, and only if, the I/O channel is not overloaded, I'd like writes to reach the actual tables. In other words, I'd like MySQL/InnoDB to use a "write back" cache policy rather than a "write through" one. Can I convince it to do that? If this is not possible, I am interested in general MySQL write-performance optimization tips. Most of the docs are about optimizing read performance, but when I get a crowd of users I am creating accounts for all of them, so it's a write-heavy workload.
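
    The knobs most commonly cited for exactly this trade-off, sketched as a my.cnf fragment; the values are illustrative and deliberately trade crash safety for write throughput, which matches the stated goal.

        [mysqld]
        # Write the InnoDB log at commit but fsync it only about once per second,
        # so a crash can lose roughly the last second of transactions.
        innodb_flush_log_at_trx_commit = 2

        # Let the OS decide when the binary log reaches disk instead of syncing per commit.
        sync_binlog = 0

        # Keep as much of the working set in RAM as the instance allows
        # (placeholder value -- size it to the data touched during the burst).
        innodb_buffer_pool_size = 4G

    Dirty pages already reach the tables in a write-back fashion via the buffer pool, so these settings mainly relax how often the commit log and binary log are forced to disk.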

    Read the article

  • How to handle knowledge handover effectively?

    - by Zizzencs
    Let's say a large enterprise opens a new office in (insert random location here) and wants the new colleagues up to speed as fast as possible. Let's also say this enterprise is a very typical one, with a complex environment, lots of history, and an almost complete lack of documentation. What has already been decided is that the new colleagues will receive how-to-style documentation for the most typical tasks and some architecture documentation for the more complicated systems. Any ideas about improving this process? And more specifically, what should such a how-to document look like to be helpful?

    Read the article

  • How to configure a Logitech USB headset on Fedora 14 [closed]

    - by Humble Debugger
    I have a Logitech USB headset (http://www.amazon.com/gp/product/B003NREDG4) but I can't hear anything nor input anything through it. I am working on a Fedora 14 desktop.

        cat /proc/asound/cards
         0 [Intel   ]: HDA-Intel - HDA Intel
                       HDA Intel at 0xfebdc000 irq 51
         2 [Headset ]: USB-Audio - Logitech USB Headset
                       Logitech Logitech USB Headset at usb-0000:00:1d.0-2, full speed

        /sbin/lsmod | grep -c snd
        14

        lsusb
        Bus 008 Device 002: ID 046d:c529 Logitech, Inc.
        Bus 008 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 007 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 006 Device 004: ID 046d:0a0b Logitech, Inc. ClearChat Pro USB
        Bus 006 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 005 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 004 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 003 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub

    Please advise.

    Read the article

  • How to send an address to the void using the hosts file, without using 127.0.0.1?

    - by magallanes
    I have some host names that I want to send straight to the void using the HOSTS file, but I don't want to use 127.0.0.1. How can I do that? Why? I want to speed up a certain process, but 127.0.0.1 is serving a web server, so if I use 127.0.0.1 that process will call my web server, consuming resources and possibly delaying the process. Right now I am using 0.0.0.0 instead of 127.0.0.1, but I am not sure whether it is correct:

        0.0.0.0 crl.microsoft.com

    Read the article
