Search Results

Search found 6670 results on 267 pages for 'speed dial'.


  • Decent 1 Gbit switch (16-24 port) for rack...

    - by TomTom
    Hello, for a rack containing a smaller number of servers (5 at the moment, and it is going to stay in that range), I am looking to replace the aging 100 Mbit switch with a 1 Gbit switch. This is for the backend between the servers. I expect some iSCSI traffic there, so a 10 Gbit option would be nice (preferably two ports, as extension modules). I don't need management; this is a pure backend of an internal cluster. I do use VLANs, but there is nothing sensible for the switch to manage there. I would like: 1U height only, obviously; preferably few moving parts; a low price ;); and enough power to run at least half the ports at full speed at the same time. Any recommendations?

    Read the article

  • How to configure an ASUS motherboard.

    - by Absolute0
    I have an ASUS P7P55-M motherboard with an Intel Core i5-750 processor and 4 GB of RAM rated at 1600 MT/s. For some reason the motherboard's default settings make all the components run at half their rated speeds. I have switched to the "D.O.C.P." profile and supposedly everything now runs as it should (verified with CPU-Z). There are also an "X.M.P." profile and a manual one. Is either DOCP or XMP safe to go with? I wouldn't use the manual mode, as I would likely mess something up badly. XMP seems to be more memory-oriented.

    Read the article

  • A space-efficient guest filesystem for grow-as-needed virtual disks?

    - by Steve Schnepp
    A common practice is to use non-preallocated virtual disks. Since they only grow as needed, they are perfect for fast backups, overallocation, and quick creation. But because file systems are usually designed for physical disks, they tend to use the whole available area[1] in order to increase speed[2] or reliability[3]. I'm looking for a filesystem that does the exact opposite: one that tries to touch the minimum number of blocks needed, through aggressive block reuse. I would happily trade some performance for space usage. There is already a similar question, but it is rather general; I have a very specific goal: space efficiency. [1] Like page caching uses all the free physical memory. [2] Canonical example: online defragmentation. [3] Canonical example: snapshotting.

    Read the article

  • Where should I plug in my monitor -- Motherboard or Graphics card?

    - by Jeremy White
    Assuming I am using the following equipment: a motherboard with HDMI/DVI outputs and no embedded graphics; a discrete graphics card (nVidia or ATI) in a PCI-E slot; and an Intel CPU with integrated graphics. Where should I plug my monitor into the computer? Presumably I'll get the fastest speed in games when connected directly to the graphics card, but there are also power savings when connecting to the motherboard and using the Intel on-board graphics. I've read that some motherboards can switch automatically between the Intel graphics and the discrete graphics. Is that something that works well, and where do I connect the monitor to enable it?

    Read the article

  • Uploading with browser makes all other browser tabs and devices disconnect

    - by fabsenet
    Whenever I upload a video to YouTube, all other browser tabs behave as if there is no connection at all. It even affects my phone and the other computers on the network, so I think it has to do with my router. When the upload is done, everything works normally again. I have never observed this behavior with any other upload. My router is a Fritz!Box 7390, and my uploading PC is connected to the router through a gigabit switch (wired). Uploading through another browser does not change anything. I understand that other sites become slow while network resources are saturated, but stopping altogether feels wrong. speed.io measures my connection at 40,894 kbit/s down, 2,685 kbit/s up, 29 ms ping, and 2,048 connections/min.

    Read the article

  • Looking for internet traffic manager software for Windows 7

    - by Semyon Perepelitsa
    I have a 128 KB/s (1 Mbit/s) internet connection. It is not very fast for downloading big files, but it is okay for web browsing. When I start a big file download, I cannot surf the web normally: pages appear slowly, especially the pictures on them. So I pause the download, but then I regularly spend a lot of time reading articles or away from the computer, when I could have let the file keep downloading. Is there any software that can automate this pausing and resuming, or simply regulate the internet speed allotted to each application automatically? I download files using different software and protocols (Download Master for HTTP and FTP; uTorrent for torrents; many other programs for updating themselves), so I don't want to be tied to one particular download program.

    Read the article

  • Will I have internet connection issues next day, if I unplug router at night?

    - by headskracher77
    I did this regularly a few years ago and used to feel like I was 'being punished' by the Internet service provider for disabling access to my computer, because trying to re-connect the next day became a constant pain. I have the same Linksys router, a Comcast modem, and high-speed broadband through their LAN. Question: who or what is at fault for lousy, slow, or absent internet connections? (Everybody's tech department blames everybody else.) The router, which is 10 years old and maybe obsolete? The modem, which came with the service plan and can connect three devices on a shared connection? The ISP? I've read they not only control and completely regulate bandwidth usage, but also ration it (true?). So can I safely 'pull the plug' each night for security, or not? Thanks.

    Read the article

  • 10-system LAN latency with ADSL modem as gateway

    - by itsoft3g
    Recently I expanded the LAN in my office from 3 to 10 computers. The structure is a star topology: one ADSL modem connected to one switch, which in turn connects to the 10 computers. We also have a Netgear WiFi device connected to the switch. The ADSL modem acts as the DHCP server, and all systems use the modem's IP as their default gateway. Network latency has now become very high: chat services like Google Talk and Skype disconnect often, and the internet becomes very, very slow when all the computers are turned on. We have 4 Mbps download and 100 Kbps upload speed. It looks like the ADSL modem cannot handle all the connections. I tried to set up a dedicated system as the default gateway, connected to the modem, but I am not sure how to do this. Please advise.
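    One hedged way to build such a dedicated gateway is a spare Linux box with two network interfaces, one facing the modem and one facing the switch. The use of Linux and the interface names below are assumptions, not details from the question:

        # Minimal NAT-gateway sketch (Linux, iptables); eth0 faces the ADSL modem,
        # eth1 faces the office switch -- both names are placeholders.
        echo 1 > /proc/sys/net/ipv4/ip_forward                   # let the box route between NICs
        iptables -t nat -A POSTROUTING -o eth0 -j MASQUERADE     # NAT LAN traffic out to the modem
        iptables -A FORWARD -i eth1 -o eth0 -j ACCEPT            # allow LAN -> internet
        iptables -A FORWARD -i eth0 -o eth1 -m state --state ESTABLISHED,RELATED -j ACCEPT   # replies back in
        # LAN clients would then use this box's eth1 address as their default gateway
        # (ideally handed out by DHCP from this box instead of the modem).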

    Read the article

  • Grep all files in a directory and print matches with file name

    - by javanix
    I have a list of log files that I create as part of a video encoding script that I wrote. I would like to search all of them and print out certain statistics from each encode - how fast it was encoded, what settings were used, etc. I can search for the average framerate in one file via this one-liner: cat ${filename} | grep average which outputs: work: average encoding speed for job is 23.211176 fps and search for the ratefactor with: cat ${filename} | grep RF I would like to search all files in the directory and print one, or preferably both, pieces of information along with the filename. Is there any way I can use find or grep to get this in a one-liner, or do I need to write a script? I would like output like this: /home/javanix/filename.log <RF line> <average line> I would like this to work on either FreeBSD 9 or Ubuntu 12.04.
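    A hedged sketch of how this could look; the *.log glob is an assumption about the naming, and -H and -e are supported by both BSD and GNU grep:

        # One-liner: prefix every matching line with its filename.
        grep -H -e 'RF' -e 'average' /home/javanix/*.log

        # If the filename should sit on its own line above the matches,
        # a short loop gives exactly that layout:
        for f in /home/javanix/*.log; do
            echo "$f"
            grep -e 'RF' -e 'average' "$f"
        done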

    Read the article

  • GPU Computing - # of GPUs supported

    - by TehTypoKing
    I currently have a desktop with 6 GPUs (3x HD 5970, each a dual-GPU card) in non-CrossFire mode. Unfortunately, it seems that Windows 7 64-bit only supports up to 4 GPUs; I have not been able to find a reliable source to confirm or deny this. If Windows 7 has this limitation, is there a Linux flavor that supports more than 4 GPUs? In case you are wondering, this is not for gaming but for high-speed single-precision computing. With this setup (if I can find 6-GPU support) I am looking to reach 13.8 teraflops. My motherboard has three x16 PCI Express gen 2 slots, and I have a 1500 W power supply plugged into a 20 A outlet. Windows is able to detect all 6 GPUs, although 2 of them display the warning "Drivers failed to load". To recap: can Windows support 6 GPUs, and if not, does Linux? Thank you.

    Read the article

  • Site-to-site VPN

    - by ronadona
    We are a small business based in Sydney that has opened a new office in London. The Sydney office has 25 employees and the London office has 6, so the traffic isn't that high; the files to be transferred are Excel sheets of 15 MB at most. Both locations have MS Server 2008 and Fortigate gateways. I set up a site-to-site VPN, but it's extremely slow. Maybe this is because our upload speed is only 1 Mbps. We will increase the upload speed to 20 Mbps in both locations, but I am afraid this will not solve the problem, as the two locations are far from each other. What's the best way to go? Should we find a provider for the VPN, or is there another technology that can be used over the internet without paying extra costs? Many thanks!

    Read the article

  • Is there a faster way to deploy an OVA template?

    - by Luke
    I need to deploy the vSphere Server Appliance 5.1. I have vSphere Client running locally, and my internet upload is capped at 3 Mbps; it says the upload is going to take about 200 minutes. When selecting a URL as opposed to a local file, does vSphere Client download the OVA locally and then upload it, or does the server download it directly? My goal is to avoid waiting 3 1/2 hours for this to upload. If specifying a URL isn't any faster, are there any other methods that would allow me to deploy from the datacenter instead of from my office? We don't have any Windows VMs installed on our cluster, so unfortunately I don't have a Windows machine with a faster upload speed.
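    One hedged alternative, if any machine inside the datacenter (Linux or Windows) can run VMware's ovftool, is to deploy the OVA from there so it never crosses the office's 3 Mbps uplink. The host name, datastore and file path below are placeholders, not details from the question:

        # Deploy an OVA directly to a host from a machine already in the datacenter.
        ovftool --acceptAllEulas \
                --datastore=datastore1 \
                --name=vSphere-Appliance \
                /tmp/appliance.ova \
                'vi://root@esxi-host.example.com/'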

    Read the article

  • What is the best filesystem for storing thousands of files in one dictionary-like id-blob structure?

    - by Ivan
    What filesystem best suits my needs? Thousands or even millions of files in one directory. Good reliability (at or close to the level of ext4 and NTFS), including fault tolerance, and good access speed. No directories are actually needed, nor descriptive names; a dictionary-like structure of id-blob pairs is all I need. No links, attributes, or access-control features are needed either. The purpose is a file store where all the metadata (describing what each file contains and who can access it) is kept in a MySQL database. As far as I know, common filesystems like NTFS and ext3/4 can become dead slow if too many files are placed in one directory - that's why I ask.

    Read the article

  • How to reinstall OS X 10.8 without internet?

    - by Sam
    I had a crash on my MacBook Air (MD231, mid-2012); my system won't boot, and unfortunately my friend erased my partition from Disk Utility. Now I have three questions: 1) My internet speed is very low - do I have to download all of OS X 10.8 from the internet to reinstall, or can I download it elsewhere and install it from a USB drive (and how)? 2) Do I have to select the "Reinstall OS X" option, and how large is that download (for example, 4.2 GB)? 3) After installing, is there any way to restore my data? Unfortunately I don't have any backup right now, and after I turn my Mac on it automatically goes into recovery mode... Thanks for your kindness.

    Read the article

  • How can I combine non-identical disks efficiently?

    - by Odys
    I have several non-identical disks of various capacities that I want to combine (somehow). Since there are no duplicate models, I can't build a RAID array out of matching drives. Is there a way to use them efficiently while still being safe? What I have in mind is software that will pool them as if they were RAID-5 or something similar. I really don't care about maximum speed; in the end I want as few logical drives as possible. Also, I don't mind spending some money on hardware if needed.
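    The question doesn't name an operating system, but as one hedged illustration of pooling mismatched disks into a single logical drive on Linux, LVM can concatenate them. Note that this gives a JBOD-style pool with no RAID-5-style parity, so a failed disk loses whatever sits on it; the device names are placeholders:

        # Pool three mismatched disks into one logical volume with LVM.
        pvcreate /dev/sdb /dev/sdc /dev/sdd        # mark each disk as an LVM physical volume
        vgcreate pool /dev/sdb /dev/sdc /dev/sdd   # combine them into one volume group
        lvcreate -l 100%FREE -n data pool          # one logical volume spanning all the capacity
        mkfs.ext4 /dev/pool/data                   # format it and mount it as a single drive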

    Read the article

  • Extracting streaming video from swf

    - by user1018733
    I am trying to speed up this video by 1.25x to 1.5x, Udacity-style. I tried grabbing http://smart.if.uidaho.edu/courses/f2012/cs512/ using the Download Flash & Video Firefox plugin, but it only grabbed the SWF player that streams the video. Is there any other way to download this video (as well as future ones)? As a last resort I can use video capture software while I'm not using the computer, but I would rather avoid that hassle and time delay if possible.

    Read the article

  • What is the computer "doing" when it is running slow and task manager is not showing any CPU activity?

    - by Joakim Tall
    A typical example is shutting down a memory-intensive application: it can take quite a while before the computer gets back up to speed. Is there some inherent cost in releasing memory? Or is it throttled by some kind of hard-drive activity, and if so, is there any good way to track that? I usually bring up Task Manager when a computer is running slow, and sorting by CPU activity usually shows which process is causing the problem, but sometimes there is no activity showing at all. And yes, I do "show processes from all users". I have been wondering about this since the days of Win2k :)

    Read the article

  • Application that can track the amount of downloads

    - by user23950
    My ISP limits me to about 25 GB of downloads a month; after exceeding the limit, my speed is halved for the next month, and it's really a pain, because I'm quite addicted to downloading things from the internet. My questions: Is there an application that can track the amount/size of my downloads over a month? Is there a trick I could use to fool my ISP? And when they say a 25 GB monthly limit, does that include web pages, manga, and video & audio streams, or just direct downloads and P2P?

    Read the article

  • Best method(s) to backup VMs running on HyperV?

    - by Kara Marfia
    We're in the middle of P2V'ing most of the network, so the current backup method is likely the worst possible - the backup agent is still installed on the guest OSes, and the backup device is dutifully pulling them onto tape, one file at a time. I suspect there's a clever way to script (PowerShell?) a suspend of the VMs, then a backup of the .vhd files, then a resume of the VMs. This seems like it would provide big speed benefits, at the cost of file-level restore (so it might be best suited to things like DCs and app servers). What methods/policies have you hammered out?

    Read the article

  • How to set up a fast VPN server

    - by Saif Bechan
    I am trying to set up a VPN with a fast download speed. The server is a Linux server, and from it I can download at 2 megabytes per second; at home I can also download at 2 megabytes per second. All the downloads are from the same source, not different servers. Now I have set up a VPN connection between my home and the server, and through it I am only downloading at 64 kilobytes per second! The connection is a PPTP server on a Debian machine. My question is whether this connection can be optimized. Should I switch to OpenVPN, or change operating systems? Or are there settings to tweak to make the connection optimal? PS: The server runs on a Xen node, and I have done the proper IP forwarding.
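    One hedged thing to check before switching VPN stacks: PPTP adds GRE/PPP overhead, and an MTU/fragmentation mismatch inside the tunnel often produces exactly this kind of throughput collapse. A sketch of clamping the TCP MSS on the Debian server (the ppp+ interface pattern assumes the usual pptpd/pppd setup; this is a diagnostic suggestion, not a confirmed fix):

        # Clamp TCP MSS on traffic forwarded into the PPTP sessions so packets
        # fit the tunnel MTU instead of being fragmented or dropped.
        iptables -t mangle -A FORWARD -o ppp+ -p tcp --tcp-flags SYN,RST SYN \
                 -j TCPMSS --clamp-mss-to-pmtu
        # Lowering the MTU on the client side of the tunnel (e.g. to 1400)
        # is another common workaround.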

    Read the article

  • Flash Media Server slow over SSL

    - by Antilogic
    We are using FMS to host a VoD site; we host FMS internally and do not use a CDN. We recently installed an SSL certificate to alleviate connection issues for clients (their networks either block or don't support RTMP). However, we're noticing that streaming over RTMPS is drastically slower (on the order of Mbps). I know SSL causes some amount of overhead, but both the client and the server show almost no signs of exertion, and Speedtest.net and a locally hosted speed test confirm that bandwidth is not an issue. I'm really not a network guru, so I'm at a loss as to where to check next. Does anyone have an idea why streaming media would run so slowly over SSL?

    Read the article

  • Request Multiple Maya Floating Server Licenses for extra Satellite clients

    - by Rob
    Hello all: I am currently setting up a 'render farm' for Maya 2008 Unlimited. One Maya workstation license comes with the ability to render on eight satellite nodes, and that works perfectly - the remote rendering runs like a charm. However, we have additional boxes to set up as satellite rendering nodes, and we have extra Maya workstation licenses. Ideally, one workstation could consume two licenses and thus render on 16 nodes, but I haven't been able to figure out how, or to determine whether it is actually possible. It's a big project, where rendering the entire thing takes on the order of weeks, so the speed-up would be worth it. Any thoughts?

    Read the article

  • When is it time to buy a new hard drive, and what considerations go into buying a new hard drive?

    - by user1125620
    I've had my current hard drive for about 4-5 years now, and I've never had a problem with it before, but now it's making whirring noises. It has done this before, and last time the noise went away the next day, but I have accumulated quite a bit of information on the drive that I wouldn't want to lose. HD Tune Pro and Belarc Advisor both say the drive is healthy, and I wouldn't want to get a new one unless it was absolutely necessary or would bring a drastic performance improvement. My only knock against the drive is that Visual Studio takes longer to load than I'd like. HD Tune Pro reports an average read speed of 54.3 MB/s; I'm not sure if that's good or bad, but it seems about average compared to similar drives on http://www.hdtune.com/testresults.html. Model #: WDC WD5000AAJS-22YFA0. So, should hard drives be replaced after a certain amount of time? Has mine reached that point? Would a new hard drive be any faster?
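    A hedged way to get more detail than a simple healthy/unhealthy verdict is to read the drive's own SMART counters with smartmontools (available for both Windows and Linux); the device path below is a placeholder for the WD5000AAJS:

        # Dump all SMART attributes for the drive.
        smartctl -a /dev/sda
        # Attributes worth watching on an aging drive:
        #   5   Reallocated_Sector_Ct   - sectors already remapped (a rising raw value is a bad sign)
        #   197 Current_Pending_Sector  - sectors waiting to be remapped
        #   198 Offline_Uncorrectable   - sectors that could not be read at all
        # 'smartctl -t short /dev/sda' starts a quick self-test; '-l selftest' shows its result.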

    Read the article

  • Rejecting new HTTP requests when server reaches a certain throughput

    - by Sam
    I have a requirement to run an HTTP server that rejects new HTTP requests (with a 503, or similar) when the global transfer rate of the current HTTP responses exceeds a certain level. For example, if the web server is transferring at 98 Mbps and a new HTTP request arrives, we would want to reject it, as we couldn't guarantee a good speed. I've had a look at mod_cband for Apache, limit_req for nginx, and lighttpd's rate-limiting features, but none of them seem to handle my (rather contrived, granted) use case. I should add that I'm open to using pretty much any web server, and am open to implementing this in iptables rules if someone can craft such a rule! (Refusing the TCP connection is fine; it doesn't have to respond with an HTTP 503.) Any suggestions?
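    Since refusing the TCP connection is acceptable, one hedged approach is a small watchdog that samples outbound throughput and toggles an iptables rule rejecting new connections to port 80 while the rate is over a threshold. The interface name, port and limit are assumptions, and this is a sketch rather than a hardened solution:

        #!/bin/bash
        # Reject NEW connections to port 80 while eth0's outbound rate exceeds ~90 Mbit/s.
        IFACE=eth0
        LIMIT_BYTES=$((90 * 1000 * 1000 / 8))              # threshold in bytes per second
        RULE=(-p tcp --dport 80 -m state --state NEW -j REJECT)

        tx_bytes() { awk -v i="$IFACE:" '$1 == i {print $10}' /proc/net/dev; }

        prev=$(tx_bytes)
        while sleep 1; do
            cur=$(tx_bytes)
            rate=$((cur - prev)); prev=$cur
            if [ "$rate" -gt "$LIMIT_BYTES" ]; then
                # insert the reject rule once (-C checks whether it is already present)
                iptables -C INPUT "${RULE[@]}" 2>/dev/null || iptables -I INPUT "${RULE[@]}"
            else
                iptables -D INPUT "${RULE[@]}" 2>/dev/null || true
            fi
        done

    Existing transfers keep running; only new connections are refused while the threshold is exceeded, which matches the stated requirement.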

    Read the article

  • Symlink across local volumes in webroot?

    - by geerlingguy
    I am looking for a good short-term solution to storage space concerns on my website. Currently, I have all uploaded files (Flash video, images, etc.) inside the 'files' directory in my web root (/home/account/public_html/files). That directory is located on my high-speed main hard drive (a 15k RPM SCSI drive). I have another drive with much more capacity, but spinning at 10k RPM (so still fast, but not as good for random reads/writes as the main drive). That entire drive is mounted at /backup, and right now I'm just using it as a backup volume. I would like to create a symlink from /home/account/public_html/files to /backup/files and have all the files reside on the second drive. However, if someone accesses a file at http://www.example.com/files/filename.jpg, would it still work if I symlinked to the second drive? (Basically, would Apache/PHP automatically know to follow the symlink for that directory?)
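    Apache will follow the symlink as long as FollowSymLinks is enabled for the directory; it often is by default, but that depends on the existing configuration. A hedged sketch of the move, using the paths from the question:

        # Move the uploads to the large drive and leave a symlink in the web root.
        mv /home/account/public_html/files /backup/files
        ln -s /backup/files /home/account/public_html/files

        # Minimal Apache fragment if symlink-following is not already enabled:
        #   <Directory /home/account/public_html>
        #       Options +FollowSymLinks
        #   </Directory>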

    Read the article
