Search Results

Search found 17233 results on 690 pages for 'download speed'.


  • Analyze a wireless network that constantly drops/has speed issues

    - by Eddie Parker
    Hello, I'm curious what the best tools are for analyzing problems with a WiFi network. Here's the scenario: I have a WiFi router (Belkin N+) that's set up in AP mode. I have three RT-N13U's that I've purchased to use as 'repeaters', but I've had so many problems when more than one of them is running (bad routes) that I've only got one active. Sometimes certain boxes on my network can't talk to others, and drops are quite frequent and quite aggravating. I'm running Mac, Windows, and Linux (Gentoo) boxes on this network, so any software or diagnostic steps that work on any of those platforms would be fine. Apologies if this is answered somewhere else - I'll close it as a dupe if so.
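
    A minimal first step while evaluating tools, assuming all you need at the start is to timestamp the drops (the router IP below is a hypothetical default): run a continuous ping against the AP from each box and log it, then correlate the gaps across machines.

        # Log a timestamped ping to the AP; works as-is on Mac and Linux,
        # and under Cygwin on Windows. 192.168.2.1 is an assumed router IP.
        ping 192.168.2.1 | while read -r line; do
            echo "$(date '+%Y-%m-%d %H:%M:%S') $line"
        done >> wifi-drops.log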

    Read the article

  • Shaping outbound Traffic to Control Download Speeds with Linux

    - by Kyle Brandt
    I have a situation where a server makes lots of requests from big webservers all at the same time. Currently, I have no control over the number of requests or the rate of requests from the application that does this. The responses from these webservers are more than the internet line can handle. (Basically, we are launching a DoS on ourselves.) I am going to push to get this fixed at the application level, but for the time being, is there any way I can use traffic shaping on the Linux server to control this? I know I can only shape outbound traffic, but maybe there is a way I can slow the TCP responses so the other side will detect congestion and this will help my situation? If there is anything like this with tc, what might the configuration look like? The idea is that traffic control might help me control which packets get dropped before they reach my router.
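
    Since the question asks what a tc configuration might look like: below is a minimal sketch of one commonly used technique, an ingress policer that drops excess inbound packets on the server so the remote TCP senders see loss and back off. The interface name and rates are assumptions.

        # Attach an ingress qdisc to the external interface (eth0 assumed)
        tc qdisc add dev eth0 handle ffff: ingress
        # Police all inbound IP traffic to ~8 Mbit/s; the excess is
        # dropped, which the sending webservers read as congestion
        tc filter add dev eth0 parent ffff: protocol ip prio 50 \
            u32 match ip src 0.0.0.0/0 \
            police rate 8mbit burst 100k drop flowid :1

    Note this polices on the server rather than the router, so the packets have already crossed the line once; it throttles the senders rather than protecting the link directly.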

    Read the article

  • Baseline upload speed for streaming video on demand?

    - by Peter Turner
    Is there a table that shows what upload bandwidth is necessary for streaming video on demand? Specifically, I'm trying to stream video on demand to 50-100 people. I'm flexible on the format, but I'd like text on the screen to be legible (i.e. not 320x240). If someone knows of a calculator or similar that would let me figure out exactly how I need to structure my video and my ISP connection, that'd be handy too.
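
    In lieu of a table, a back-of-envelope sketch: required upstream bandwidth is roughly per-viewer bitrate times concurrent viewers, plus headroom. The bitrate below is an assumed figure for legible standard-definition video, not a measured one.

        viewers=100          # concurrent streams (upper bound from the question)
        kbps_per_viewer=800  # assumed bitrate for legible 640x480 video
        overhead=120         # ~20% headroom for container and retransmits
        echo "$(( viewers * kbps_per_viewer * overhead / 100 / 1000 )) Mbit/s upstream"
        # => 96 Mbit/s - an audience this size usually points at a CDN or
        #    streaming host rather than a single office/home uplink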

    Read the article

  • Java update "Failed to download required installation files."

    - by therefromhere
    On my Windows 7 machine the Java update consistently fails with this error message. This seems to be a common problem; does anyone have an insight into what's going wrong, and is there a fix for it other than either disabling the update check (seems a bad idea from a security point of view) or waiting for the nag message and then manually installing the new version (annoying and stupid from a usability point of view)? Note that I did install the previous version manually (updating from 6.23 to 6.30 I think?), thinking that might resolve the issue, but no luck.

    Read the article

  • Why does a pdf file download result in varying bytes logged, all with sc-status 200

    - by Pat James
    I have a mojoportal CMS installation on an IIS7 server where users are reporting problems downloading a PDF file. It always downloads fine for me and most others, either displaying in the browser or in Adobe Reader. Using logparser to query the IIS logs, all the responses have status 200 (OK) or 304 (Not Modified), but the bytes sent vary quite a bit: sometimes zero, some 211, some about half the full file size of 27,059 bytes, and lots in between. Plenty show the full 27,059. Do the entries with smaller byte counts represent errors of some kind, correlating with the problems reported? Is this likely to be a browser/client issue or a server-side problem? If there is any other info that would be helpful, let me know. This is a shared hosting server, though, so I am somewhat limited in what I can dig into on the server.
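
    One way to see whether the short transfers cluster on particular clients is to group by bytes sent and user agent. A hedged Log Parser sketch - the log file mask and URL pattern are assumptions for this setup:

        REM Group responses for the PDF by sc-bytes and browser
        LogParser.exe -i:IISW3C "SELECT sc-bytes, cs(User-Agent), COUNT(*) AS hits FROM ex*.log WHERE cs-uri-stem LIKE '%.pdf' GROUP BY sc-bytes, cs(User-Agent) ORDER BY hits DESC"

    If the partial transfers line up with one browser, or with sc-win32-status 64 (client closed the connection) when that field is logged, they are most likely aborted or ranged client reads rather than server errors.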

    Read the article

  • Download JDK onto a remote server

    - by itsadok
    I want to get the latest JDK onto a server in a remote location. Downloading the JDK from Sun's website requires jumping through all kinds of hoops until you actually get the file. I'm not sure exactly if they use cookies or my IP address, but simply copying the file URL and trying wget on the server doesn't work. Googling for mirrors of the JDK, I could only find old versions. Right now I'm left with the option of downloading it onto my computer, then uploading it to the server. This feels slow and stupid. Anyone got a better idea? EDIT: Thanks for all the replies. Just to clarify, as I'm writing this I'm rsyncing the 78MB file to my server. It should be done in about an hour, so it's not such a big deal. However, since this is not the first time I'm doing this, I was hoping for a better solution for next time. Solution: What I ended up doing was installing a text-mode browser on the server and completing the download from there:

        sudo aptitude install lynx-cur
        www-browser http://java.sun.com/javase/downloads/

    From there it's mostly using the arrow and enter keys, and answering "Yes" to a lot of lynx security questions (about cookies and certificates). Thanks to resonator.

    Read the article

  • Host timeout during file upload/download over SFTP/SSH

    - by kritop
    I tried different clients because I thought it was client-related, but all of them sooner or later disconnect or stop uploading/downloading files, and I get a timeout disconnect. After a reconnect it works again for a bit of time. Really strange; I cannot figure out the reason. I'm on a Mac and the server is a Debian VPS! If you need further information, please ask! I appreciate any tips, because I'm kinda stuck!
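
    A common culprit for this pattern is a NAT device or firewall silently expiring idle TCP state mid-session. A hedged client-side sketch using SSH keepalives (the host alias is hypothetical, and this assumes the drops are idle timeouts rather than real packet loss):

        # ~/.ssh/config on the Mac: send an application-level keepalive
        # every 15s; give up only after 4 unanswered probes
        Host my-vps
            HostName vps.example.com
            ServerAliveInterval 15
            ServerAliveCountMax 4

    The server-side equivalents are ClientAliveInterval and ClientAliveCountMax in /etc/ssh/sshd_config on the Debian VPS.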

    Read the article

  • Increase data transfer speed through bonding/lacp?

    - by Matteo
    I want to maximize the throughput of a data transfer between two servers. The copy will be made at the application layer using Robocopy. To clear things up, please check my Visio schema of the network:

        FS1 ---------(SW1)===========(SW2)--------- FS2

    SW1 to SW2 is connected through 10 Gigabit Ethernet over fiber; FS1 to SW1 and FS2 to SW2 are each connected through 1 Gigabit Ethernet. The first idea I've come up with is to use LACP, so I could use two Gigabit Ethernet links between each server and its switch. A colleague told me that LACP is for availability and not performance, so he reckons this solution will not work. Is he right? Do I have other options? Thank you very much
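
    For a single flow the colleague is broadly right: LACP (802.3ad) hashes each conversation onto one member link, so one TCP stream still tops out at 1 Gbit/s. Several parallel streams can spread across the links, but only if the switch's hash policy includes L4 ports, since the MAC/IP pair is identical here. Robocopy can generate those parallel streams; a hedged sketch with assumed paths:

        REM /MT runs multithreaded copies (8 threads by default, up to
        REM 128), giving the LACP hash several flows to distribute
        robocopy D:\data \\FS2\share /E /MT:16 /R:1 /W:1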

    Read the article

  • How to use mencoder to speed up video?

    - by timday
    There's a webcam streaming WMV video that I can play/record with mplayer/mencoder (Debian Lenny including Debian-multimedia) just fine. I'd like to capture the video such that, when played back, it's faster (so say 100s of real time zooms past in every 1s of video). The "framestep" filter seems like it ought to do this, but when I try e.g.

        mencoder -o output.avi -ofps 25 -vf framestep=100 -ovc lavc -lavcopts vcodec=mpeg4:... <mms source URL>

    what I actually get is something which plays for the same length of time as it's allowed to capture for, but which only updates the displayed image every 4 seconds. Is there any way mencoder can be persuaded to do what I want? (By the way, ideally I'd like to average the frames together to obtain motion blur instead of just dropping/discarding most of them.)
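
    The symptom fits framestep's behaviour: it drops frames but keeps the original timestamps, so the duration is unchanged. A hedged sketch of the usual fix is to capture first, then re-encode while overriding the declared input rate so the surviving frames play back-to-back (all rates assume a 25 fps source):

        # Keep 1 frame in 100, declare the input as 2500 fps, and emit
        # 25 fps output: the result plays 100x faster than real time
        mencoder capture.avi -o fast.avi -nosound \
            -vf framestep=100 -fps 2500 -ofps 25 \
            -ovc lavc -lavcopts vcodec=mpeg4

    This still discards frames rather than averaging them, so it won't produce the motion blur mentioned at the end.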

    Read the article

  • Increase backup speed, Backup Exec 2010 - QNAP TS419U+ NAS

    - by user99912
    We have a QNAP NAS whose network shares are being backed up by Backup Exec 2010 over SMB. We can't install the remote agent on the NAS as it has an ARM processor and, as far as I am aware, there is no compatible agent. Do you have any suggestions for a faster method of backing up these shares than the current scenario? Network bandwidth is not the issue; it seems that this access method is simply not able to go any quicker. We've also added the NAS shares to the start of the selection list, but we're still running into an 18-hour total backup time (the total amount of data on the NAS is roughly 650GB). Any comments and/or suggestions welcome. EDIT: Data is being pulled from the NAS by Backup Exec to an LTO4 tape drive.
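
    One hedged option when no agent can run on the NAS is staging: pull the shares to local disk on the media server first (QNAP firmware can expose an rsync service), then let Backup Exec stream the local copy to the LTO4 at tape speed. The module and path names below are assumptions:

        # After the first pass rsync only moves changed files, so the
        # slow per-file SMB directory walk largely disappears
        rsync -a --delete rsync://qnap-nas/share/ /staging/share/

    The trade-off is the staging disk itself - roughly 650GB plus growth in this case.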

    Read the article

  • My php homepage downloads index.php instead of being processed on Gandi.net

    - by alekone
    If I go to the homepage of my website http://www.website.com (on a brand new server), the index.php gets downloaded instead of processed. I don't have the same problem in other folders. My .htaccess reads:

        AddHandler php5-script .php

    What could this be? I suspect it's something with the PHP config or the .htaccess, but I'm not able to figure it out. Help please! Edit: I don't know if this helps: it's a WordPress installation, and I have this problem only on the public part of the website, not in the admin (which renders correctly).
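
    Whether php5-script is a registered handler name depends on how the host built PHP (as CGI or as mod_php). A hedged alternative to try - these lines are assumptions, not Gandi-documented values:

        # .htaccess - handler/type names vary with the host's PHP build
        AddType application/x-httpd-php .php
        # or, on some mod_php5 setups:
        # AddHandler application/x-httpd-php5 .php

    Since the admin renders correctly, comparing the .htaccess it runs under with the public one would also narrow this down quickly.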

    Read the article

  • Change location of RSS Dynamic Desktops

    - by Andy
    I'm currently using CCleaner to take care of my computer, but I also have a dynamic desktop background provided by Bing (I'm running Windows 7 HP), and unfortunately the two conflict. Whenever I 'clean' my computer using CCleaner it messes up my desktop backgrounds, as they are stored in the Temporary Internet Files directory, and for some reason I don't appear to be able to get as far as the 'Enclosures' subdirectory in order to tell CCleaner to exclude it (I can see it in Windows Explorer but not in CCleaner's directory browser). Therefore, I am looking for an alternative solution to this problem and wondered if I could change the directory to which the RSS feed's images are downloaded. If anybody knows how to do this, I would be grateful if you could share; indeed, I would be equally grateful if anyone knows any other way of getting around CCleaner. Please note that I don't want to stop cleaning the whole of my temporary internet files - I just don't want the downloaded wallpapers to be deleted... Thanks in advance!

    Read the article

  • Netgear router speed problem after Xbox use

    - by John Dudley
    When my son is at my place at the weekend, he plays Xbox Live over the internet, using my wireless network (Netgear 'g' type router). This usually thrashes and crashes the router to the extent that I have to hard-boot it to get it working again. However, after this weekend, on my two laptops, I'm left with the problem that the router is working but I'm only getting 0.38Mbps out of it, at all times of the day. I've tried hard-booting the router, but no difference. Could this be a knock-on effect of the Xbox use? I can't believe the router could be damaged yet still working, only slower - is that possible? Tiscali haven't come back to me yet on any network issues. Thanks in advance

    Read the article

  • Optimizing wireless router speed and minimizing interference

    - by Tchalvak
    I've been experiencing problems with my wireless connectivity lately, and want to make sure it's not related to the abundance of other wireless routers here in my building. So, what I'm looking for is a method (probably via some application or another) to audit the wireless channels (and other factors that might be important that I don't even know of yet) floating through the aether around me. Ubuntu or other Linux apps are preferred, but some kind of Windows/Mac solution is possible, since I do have other OSes around that I could install and test on. Router: Netgear WGT624 v3. Hearsay tells me that channels 1, 6, and 11 are "non-overlapping" (I expect they aren't used for non-wireless-router purposes or something; not sure how they couldn't overlap with other routers using other channels), so perhaps my choices of channel are limited. If channels aren't really a big concern, I'd be happy to get links to other optimizations I should look into.
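
    On the channel question: 2.4 GHz channels are spaced 5 MHz apart but each transmission is roughly 20 MHz wide, so only channels at least five numbers apart - 1, 6, and 11 - avoid bleeding into each other; routers on the in-between channels do overlap both neighbours. For the audit itself, the stock Linux wireless tools are enough for a first pass (the interface name is an assumption):

        # List nearby networks with their channel and signal strength,
        # then pick whichever of 1/6/11 is least occupied
        sudo iwlist wlan0 scan | grep -E 'ESSID|Channel|Quality'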

    Read the article

  • Can access website but images, css stylesheets and javascript files do not download

    - by Triztian
    I have this problem and I'm not sure about its source. Basically the title describes the issue: I can access the webpage and see the HTML structure, but no resources are being downloaded, nor do I have access to them from the browser - that means no JavaScript, no CSS styles, and no images. Any solutions? I'm using Tomcat, by the way. EDIT 1: If I access the Tomcat manager from within the server it also blocks the images. I'm running on Windows Server 2008 R2.
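
    To separate a browser problem from a server one, it helps to look at the raw response for a single static file; a hedged sketch with an assumed URL:

        # A 404 suggests a servlet/default-servlet mapping problem; a 200
        # with an unexpected Content-Type points at MIME configuration;
        # no response at all points at a firewall or connector filter
        curl -I http://yourserver:8080/app/styles/main.css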

    Read the article

  • Downloading a large site with wget

    - by Evan Gill
    Hi, I'm trying to mirror a very large site but wget never seems to finish properly. I am using the command:

        wget -r -l inf -nc -w 0.5 {the-site}

    I have downloaded a good portion of the site, but not the whole thing. The content does not change fast enough to bother using time-stamping. After running overnight, this message appears:

        File `{filename}.html' already there; not retrieving.
        File `{filename}.html' already there; not retrieving.
        File `{filename}.html' already there; not retrieving.
        File `{filename}.html' already there; not retrieving.
        Killed

    Does anyone know what is happening and how I can fix it?
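
    A bare Killed line with no wget error usually means the kernel's OOM killer ended the process; on very large recursive mirrors wget holds the whole link graph in memory. A hedged workaround is simply to rerun until it finishes, since -nc makes each pass skip everything already on disk:

        # Restart the mirror until wget exits cleanly; each pass resumes
        # roughly where the previous one was killed ($SITE is a placeholder)
        until wget -r -l inf -nc -w 0.5 "$SITE"; do
            echo "wget exited with status $?; restarting" >&2
            sleep 60
        done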

    Read the article

  • Mac: Script application downloaded from the Internet

    - by Svish
    I downloaded a PHP framework and have started to make a website using it. Sometimes I need to look at the source of that framework, and every time I open a file I haven't opened before I get this message: “somefile.php” is a script application which was downloaded from the Internet. Are you sure you want to open it? That is OK and nice I suppose, but I am getting tired of it. Is there a way I can fix all the files in my web directory so that the OS somehow forgets the files are from the Internet, or something like that?
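
    That prompt is driven by the com.apple.quarantine extended attribute that downloaded files carry; stripping it from the project directory stops the dialog. A sketch with an assumed path:

        # Remove the quarantine flag from every file under the project;
        # errors for files without the attribute are discarded
        find ~/Sites/framework -type f \
            -exec xattr -d com.apple.quarantine {} + 2>/dev/null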

    Read the article

  • How to speed up Apache

    - by Zen_silence
    We have a server with 8 cores, 16GB of RAM, and RAID 0 SAS 10K drives. Our goal is to use this to serve a fairly simple PHP application quickly. We have tested all the other components and we think we have narrowed the bottleneck down to Apache. I am no Apache guru. I have done some research and tested a couple of things, but when I test with JMeter, launching 100 concurrent connections against the server, the first 10-20 come back quickly (30-100ms) but the rest take between 1000ms and 3000ms. Anyone have any ideas on what to change in our Apache config to make this faster? Right now it's a vanilla install of Apache.
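
    That pattern - the first few connections fast, the rest queueing - often means the default worker limits are exhausted at 100 concurrent clients. A hedged starting point for a prefork-MPM install; every value here is an assumption to verify under JMeter load, not a recommendation:

        # httpd.conf: allow more simultaneous children. Each child costs
        # RAM, so keep MaxClients sized to fit within the 16GB
        <IfModule mpm_prefork_module>
            StartServers         20
            MinSpareServers      20
            MaxSpareServers      40
            ServerLimit         256
            MaxClients          256
            MaxRequestsPerChild 4000
        </IfModule>

    KeepAlive settings and an opcode cache on the PHP side are the usual next knobs after the MPM limits.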

    Read the article

  • PHP file gets downloaded instead of executed when browsed in any browser

    - by baltusaj
    I have a phpinfo.php file which I am trying to run by browsing to it, but the browser downloads the file instead of executing it. phpinfo.php:

        <?php phpinfo(); ?>

    Following this post, I added the following lines to my /etc/apache2/httpd.conf and restarted Apache, but in vain; phpinfo.php still gets downloaded.

        AddType application/x-httpd-php .php .phtml
        AddType application/x-httpd-php-source .phps

    Have I added these lines to the correct file? On an openSUSE forum the following was mentioned. I followed this too, but still no success; the same problem persists. "In case the browser wants to save your php files instead of displaying the content, you should enable php support in the /etc/apache2/mod_userdir.conf file. Add the following line to it, just after the line, and restart the server."

        Include /etc/apache2/conf.d/php5.conf
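
    Before adjusting handler lines, it's worth confirming the PHP module is loaded at all; a hedged sketch for openSUSE's apache2 packaging (command names assumed for this release era):

        # Is mod_php present in the loaded-module list?
        apache2ctl -M | grep -i php
        # If not, enable it and restart (rcapache2 on older openSUSE)
        sudo a2enmod php5
        sudo rcapache2 restart

    If the module is loaded and the file still downloads, the AddType lines may simply live in a file the server never includes; openSUSE assembles its Apache configuration from fragments under /etc/apache2/.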

    Read the article
