Search Results

Search found 17233 results on 690 pages for 'download speed'.


  • Network latency and speed of light

    - by James
    This was kind of covered by the following: Is minimum latency fixed by the speed of light?, but I would like to follow it up a bit. The scenario is as follows: we have two opposing sites, one on the West Coast of the US and one in Ireland. The customer is in central Europe and has requested a latency test. Ireland gives responses of ~65-70ms. However, the West Coast guys claim to be faster, with a response of 60ms. Now, a quick check says that light in fiber would take about 42ms to make the trip to the States and 8.5ms to Ireland. So obviously this is a single hop and does not include routers, switches, firewalls, protocol overhead etc. Would I be right to call BS on their figures? As a final note, I tested a ping to a Google IP address that was allegedly on the West Coast, from a site that covered a similar distance, and was amazed to get a response time of 20ms, suggesting ICMP packets that travel at twice the speed of light. So A) what am I missing, and B) am I right to suspect shenanigans? UPDATE: Guys, thanks for your help so far; I have been reading various previous questions on this. About 5 years ago I had an issue where the hop from the UK to Ireland added 10ms of latency no matter what we did. In the end I moved the servers; so imagine my surprise when I have guys claiming they are 5ms faster with a transatlantic trip. So again, should I call BS? Oh, and assume both sites are normal mortals that don't have access to Google's magical routing, warp drives or flux capacitors. :)

    Read the article
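
    A back-of-the-envelope check of the figures in the question above, taking light in fiber at roughly 200,000 km/s (about two-thirds of c); the distances are rough great-circle assumptions for a customer in central Europe, not measured route lengths:

        C_FIBER_KM_PER_S = 200_000  # light in fiber travels at roughly two-thirds of c

        # rough one-way great-circle distances (assumed figures, real fiber paths are longer)
        routes_km = {
            "central Europe -> Ireland": 1_500,
            "central Europe -> US West Coast": 9_000,
        }

        for route, km in routes_km.items():
            one_way_ms = km / C_FIBER_KM_PER_S * 1000
            print(f"{route}: one-way >= {one_way_ms:.1f} ms, round trip >= {2 * one_way_ms:.1f} ms")

    On those assumptions the physical floor for a transatlantic round trip is around 90 ms before any router, switch or protocol overhead, and a ping is a round trip, so the 42 ms one-way figure has to be doubled before comparing it with a ping time. A 60 ms response from central Europe to a genuine US West Coast host (or 20 ms to an allegedly West Coast Google address) is below that floor, which points at the reply coming from somewhere much closer, e.g. an anycast or CDN edge node, rather than at faster-than-light packets.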

  • Having an issue trying to get Gigabit speed across my network (Ubuntu Server)

    - by user94217
    I've just started looking into the network speeds at my office; the entire network is set up to be "Gigabit". This includes Gb switches, Gb network cards and Cat 5e cabling. I'm not expecting the full speed, I just want more than ~90 Mb/s. I've been running some tests with iperf (the Linux tool) and checking the hardware with ethtool. I have 3 servers, and when doing my checks/tests I discovered that the two backup servers can access each other at around 450 Mb/s, but when using either one of them to connect to and test the main server, I only get 90 Mb/s, even though ethtool shows the network card running at 1000/Full. The only difference between all the servers/network cards is the "Port" which ethtool shows. On the two backup servers the "Port" is shown as MII, yet on the other it's shown as "Twisted Pair". When using ethtool -s to manually set the "Port" to MII on the main server, it loses all connectivity and does not show "Speed" or "Duplex". Anyway, am I doing something wrong? Is there a specific reason my main server cannot use Gb when there appears to be no difference except the "Port"?

    Read the article
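
    For repeating the measurements in the question above in one pass, a small wrapper around iperf (v2) keeps the comparison between boxes consistent; the hostnames are placeholders and each target is assumed to already be running iperf in server mode (iperf -s):

        import subprocess

        # placeholder hostnames; each target must already be running `iperf -s`
        hosts = ["backup1.example.lan", "backup2.example.lan", "mainserver.example.lan"]

        for host in hosts:
            print(f"--- {host} ---")
            # -f m reports in Mbits/sec, -t 10 runs a 10 second test
            result = subprocess.run(
                ["iperf", "-c", host, "-f", "m", "-t", "10"],
                capture_output=True, text=True,
            )
            print(result.stdout.strip() or result.stderr.strip())

    Reading /sys/class/net/<iface>/speed and /sys/class/net/<iface>/duplex on the main server alongside ethtool's output is another cross-check: a steady ~90 Mb/s ceiling usually means something in the path (cable, switch port, auto-negotiation) fell back to 100 Mb/s full duplex, whatever the card claims.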

  • Why is my connection slow?

    - by Jay R.
    I have a Dell Precision T5400 with a Broadcom 1Gb onboard NIC. For some strange reason, when I access machines on our local network, the best I can get is around 125KB/s download speed. My laptop, which has a 10/100Mb NIC onboard, usually gets around 300KB/s or better from the same network resource. Both machines are plugged into the same 1Gb switch, which connects to our local network wall jack at 100Mb half duplex. There is also a printer plugged into the same switch at 100Mb full. The resource I'm using for the test is a 30MB zip file copied from a Jetty webserver that is running as part of a CruiseControl installation. The CruiseControl machine is running Windows XP with full real-time antivirus and Altiris patch management and inventory running. That stuff on its own is eating some of the download speed. I've seen the laptop reach multiple MB/s download speeds before, but the desktop never seems to get past 125KB/s to 130KB/s. In Windows XP, before I upgraded the driver on the desktop, it was that slow. In Fedora, it is still slow, even though it appears to be using the same driver version as the upgraded Windows driver. The upgraded Windows driver is faster, but still not nearly as fast as the laptop. What gives? Any insight to improve the situation would be appreciated. Could it be that the Broadcom board just isn't that good, or that the driver in Linux is just not as good as the Windows one?

    Read the article
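
    A way to put both machines in the question above on equal footing is to time the same 30 MB download over plain HTTP and compute the sustained rate; the URL below is a placeholder for the Jetty/CruiseControl artifact mentioned in the question:

        import time
        import urllib.request

        # placeholder URL for the 30 MB zip served by the CruiseControl Jetty instance
        url = "http://buildserver.example.local:8080/artifacts/build.zip"

        start = time.perf_counter()
        size = 0
        with urllib.request.urlopen(url) as resp:
            while True:
                chunk = resp.read(256 * 1024)
                if not chunk:
                    break
                size += len(chunk)
        elapsed = time.perf_counter() - start

        print(f"{size / 1024:.0f} KB in {elapsed:.1f} s = {size / 1024 / elapsed:.0f} KB/s")

    Running the same script on the desktop and the laptop takes the copy tool and SMB out of the comparison; if the Broadcom desktop still tops out around 125 KB/s (about 1 Mbit/s) while even the 100 Mb half-duplex wall jack should allow several MB/s, the NIC, its driver or a duplex mismatch on that machine is the place to dig.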

  • Over gigabit connection, Teracopy does 31MB/s, but Windows 8 does it at ~109MB per second?

    - by Gaurang
    I got my brain-melting first taste of Gigabit networking today, between my 2011 Mac mini and Windows 8 Pro desktop connected via Cat 5e to a Linksys WRT320N (sporting DD-WRT). After making sure that the line speed on both systems showed 1Gbps, I proceeded to copy a 2.4GB MP4 from the Mini to the Win 8 desktop (SMB sharing). Although satisfied with the 30-34 MB/s that Teracopy was showing (a proper step up for me from 10 MB/s), I was still curious about this massive difference between the advertised and real-world speed. Two hours of Google had me believing that there were other factors that resulted in lower speeds, SMB being one. So just for the sake of doing it, I ran iperf between the two systems and guess what it showed: around 875Mbps on both systems! I then stumbled upon this little piece of info, after which I turned off Teracopy and copied the same file with Windows 8's regular copier. 109 MB/s. Molten brains :) What exactly is causing this? And can I enable such speeds via Teracopy? I really dig the extra features that Teracopy has, and will surely miss them now :D

    Read the article
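
    A quick unit check shows the two "fast" numbers in the question above are the same thing: iperf reports megabits per second, while copy dialogs report megabytes per second.

        iperf_mbit_s = 875            # throughput iperf reported on the gigabit link
        ceiling_mb_s = iperf_mbit_s / 8

        print(ceiling_mb_s)           # ~109 MB/s: the most any copy tool can move over that link
        print(31 / ceiling_mb_s)      # Teracopy was using under a third of the available bandwidth
        print(109 / ceiling_mb_s)     # the Windows 8 copy was essentially saturating it

    So Explorer's ~109 MB/s is simply wire speed for an 875 Mbit/s link, and Teracopy's 31 MB/s is the outlier; that gap is commonly attributed to Teracopy's buffer and I/O settings, which is worth experimenting with in its options rather than in the network.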

  • Google PageSpeed, optimizing Google's own elements

    - by mowgli
    I'm trying Google's PageSpeed online service Ironically, it's primarily highlighting Google's own services as something that needs improvement on my site 1) jQuery from Google: blocking. So I moved all javascript from <head> to the end of the document before </body>. That helped 2) Linking to external Google Font CSS (in <head>): blocking. But the font is critical to the design of the page and should load before much else 3) Google Analytics: Caching is not good. (Google has set it internally to 2 hours expiration). Don't know how to change this (this is also placed at the bottom of page) The Google Font is highlighted as a big priority to change. How can I fix this? Where/how should I call the the font?

    Read the article

  • How to See What Web Sites Your Computer is Secretly Connecting To

    - by Lori Kaufman
    Has your internet connection become slower than it should be? There may be a chance that you have some malware, spyware, or adware that is using your internet connection in the background without your knowledge. Here’s how to see what’s going on under the hood.

    Read the article
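
    If the article's GUI tools aren't to hand, the same information (which process owns which outbound connection) can be pulled with the third-party psutil package; a minimal sketch, which on most systems needs to run as Administrator/root:

        import psutil

        # list established connections and the owning process, roughly what
        # `netstat -b` or TCPView show; remote addresses are IPs, not hostnames
        for conn in psutil.net_connections(kind="inet"):
            if conn.status == psutil.CONN_ESTABLISHED and conn.raddr:
                name = psutil.Process(conn.pid).name() if conn.pid else "unknown"
                print(f"{name:<25} {conn.laddr.ip}:{conn.laddr.port} -> {conn.raddr.ip}:{conn.raddr.port}")

    Anything phoning home constantly from a process you don't recognise is a good candidate to look up and, if it turns out to be adware, remove.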

  • Examples of MMOs with small dev teams

    - by Aliud Alius
    I'd like to see a list of MMOs with small development teams in order to better understand where small teams have found a place for themselves. While examples of MMORPGs are of interest, so are games focusing on socializing, trade and commerce, city or empire building, crafting, exploration, strategy and so on. Any shipping game supporting between, say, 800 and 10,000 simultaneous players belongs on this list. Thanks.

    Read the article

  • Determine web page draw time via a program

    - by Kevin Burke
    Google Chrome has a nice tool for determining the time at which the page begins drawing, in the Network tab in Developer Tools. Similarly, sites like webpagetest.org can tell you the draw time and give you the whole waterfall of page loads for a given web page. I was wondering if I could automate the process of finding the time to first paint for all of the pages on my site, so I can share this data within my company. Obviously the page draw time will depend on the latency and throughput of your connection, but I'm more concerned with the relative data about pages on our site. Can I get this data from Selenium or another tool? Thanks, Kevin

    Read the article
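
    Selenium can do this by asking the browser for its own Navigation Timing numbers after each load. A rough sketch with the Python bindings and Chrome (chromedriver on the PATH); the URLs are placeholders, and the 'paint' entries are only reported by browsers that implement the Paint Timing API:

        from selenium import webdriver

        pages = ["https://www.example.com/", "https://www.example.com/pricing"]

        driver = webdriver.Chrome()
        for url in pages:
            driver.get(url)
            # milliseconds from navigation start to first byte and to DOM ready
            ttfb, dom_ready = driver.execute_script(
                "var t = window.performance.timing;"
                "return [t.responseStart - t.navigationStart,"
                "        t.domContentLoadedEventEnd - t.navigationStart];")
            # first contentful paint, if the browser exposes Paint Timing entries
            first_paint = driver.execute_script(
                "var e = performance.getEntriesByType('paint')"
                "         .find(function(p) { return p.name === 'first-contentful-paint'; });"
                "return e ? e.startTime : null;")
            print(url, "TTFB:", ttfb, "ms  DOM ready:", dom_ready, "ms  first paint:", first_paint, "ms")
        driver.quit()

    Run from the same machine and connection each time, the absolute numbers still depend on that connection, but the relative ranking of the site's pages stays stable enough to share internally, which is what the question is after.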

  • wordpress sites are slow on shared hosting but plain html/css sites are fast

    - by sam
    I've got a shared hosting account: unlimited sites, unlimited GB, unlimited bandwidth, etc. etc. Of course, because it's shared (and a cheap one at that), there are too many sites on each server and it all runs slowly due to lack of RAM. What I've found is that my plain HTML/CSS/JS sites run an awful lot faster than my WordPress sites on this hosting, and I was trying to work out why. I'm not exactly sure how a browser sends a request for a page or what the full process of request and delivery is, but are my HTML sites running faster because they are just serving static code to the browser, whereas the WordPress sites have to run database queries to build each page before it's delivered? Is that correct, or am I completely off course?

    Read the article
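
    That is roughly what happens: a static file is read from disk and sent as-is, while WordPress runs PHP and a set of MySQL queries to assemble each page before the first byte goes out, and on an overloaded shared box that assembly time dominates. A crude way to see the difference from outside, using the third-party requests library and placeholder URLs:

        import requests

        pages = {
            "static html": "https://example.com/plain-page.html",
            "wordpress":   "https://example.com/blog/",
        }

        for label, url in pages.items():
            samples = []
            for _ in range(5):
                r = requests.get(url)
                # r.elapsed measures request sent -> response headers received,
                # i.e. mostly server-side work rather than transfer time
                samples.append(r.elapsed.total_seconds() * 1000)
            print(f"{label:<12} best {min(samples):.0f} ms, average {sum(samples) / len(samples):.0f} ms")

    If the WordPress figures come out several times higher than the static ones, a page-cache plugin (which writes rendered pages out as static HTML) usually recovers most of the gap on shared hosting.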

  • DNS lookup when using a CDN

    - by Steven Wu
    Using a CDN can vastly improve the load time of a website. I've been thinking of using one to host all my external files like CSS, JS, images, videos etc. However, when linking to a CDN, wouldn't the browser have to do an additional DNS lookup? So wouldn't this be counterproductive? Or does the benefit of hosting all external files on a CDN outweigh the additional cost of a DNS lookup? What are your thoughts?

    Read the article
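
    The extra lookup is real, but it happens roughly once per hostname and is then cached by the OS, the browser and upstream resolvers, so it is usually small next to what the CDN saves. A crude way to put a number on it, with placeholder hostnames:

        import socket
        import time

        hosts = ["www.example.com", "cdn.example-cdn.net"]

        for host in hosts:
            start = time.perf_counter()
            socket.getaddrinfo(host, 443)          # cold lookup (may still hit a nearby resolver cache)
            first_ms = (time.perf_counter() - start) * 1000

            start = time.perf_counter()
            socket.getaddrinfo(host, 443)          # repeat lookup, normally answered from cache
            repeat_ms = (time.perf_counter() - start) * 1000

            print(f"{host:<22} first {first_ms:.1f} ms, repeat {repeat_ms:.1f} ms")

    Tens of milliseconds once per hostname, which a dns-prefetch hint in the page head can also hide, is generally a good trade against the extra parallel downloads and better caching a CDN gives.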

  • How is this site so fast?

    - by user8628
    How is the website http://dftba.com/ so fast? When I click a link, the page loads right away. What makes it work like this? How do I make my site work like this? Some of the objects on the site are hosted by a domain called ecogeek-cdn.net; who is this company and why do they host the images for this site? I have been looking into this site for some time because I want my site to be like it. What I have found so far:

        - the site uses Apache
        - the site uses Python (when asked, the developer told me this)
        - the site uses jQuery and jQuery UI
        - the site is custom built, not using WordPress
        - the site is owned/hosted by Liquid Web
        - the site gets a million users a month
        - the site launched in January
        - the site uses cPanel
        - the site does not accept SSH or FTP connections from outside (I tried to connect but was denied each time); they do have SSH and FTP, but only allowed from their own addresses

    Please excuse my English; it is not as good as yours.

    Read the article

  • Suitable SDK to develop quick game?

    - by gRnt
    I'm currently undertaking a personal project at home that I need to turn around inside the next few months (which, working full time and still learning programming, is a tad difficult). I'm looking for suggestions on SDKs or tools (preferably free, or that come with games, similar to Steam tools) that I can use to develop a "game". I'm OK with coding but have no 3D development skills at all. I've very little experience with mod tools or SDKs, but I'm hoping someone can point me in the direction of one that offers the following: a decent library of prefab 3D models to build scenes, and the ability to add scripting to the scene. I've used Unity before and would prefer to continue to do so; however, I really have the worst 3D skills imaginable and can't waste time learning them. I'd be looking for prefab items that are both industrial and possibly more lush environments (trees etc.). If it makes any difference (due to licensing and what-not), I WILL NOT be selling this game or marketing it in any way, and I am a university student if any places do education licences. Another alternative would be to source free 3D models elsewhere, but again, while I'm still learning I have no idea where to look; if someone can point me in the right direction I'll do the rest of the digging. Thanks

    Read the article

  • Serving images from different domain

    - by Tom Gullen
    Google audit: Serve static content from a cookieless domain (15) 2.65KB of cookies were sent with the following static resources. Serve these static resources from a domain that does not set cookies: If my domain is widgets.com, should I set up an img.widgets.com that serves these resources? How beneficial is this? Edit: I set up img.widgets.com to serve images from, and changed all images to this URL. But I still get that message?

    Read the article
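
    One common reason the warning persists is that the site's cookies are set with Domain=.widgets.com, so browsers keep sending them to img.widgets.com as well; a small illustration with the third-party requests library (widgets.com and img.widgets.com are the hypothetical names from the question):

        import requests

        s = requests.Session()
        # a cookie scoped to the registered domain, as analytics and login cookies often are
        s.cookies.set("session", "abc123", domain=".widgets.com")

        req = requests.Request("GET", "https://img.widgets.com/logo.png")
        prepared = s.prepare_request(req)
        print(prepared.headers.get("Cookie"))   # -> session=abc123, so img.widgets.com is not cookieless

    To actually drop the Cookie header, either scope the site's cookies to www.widgets.com instead of .widgets.com, or serve the static files from a completely separate registered domain (e.g. a hypothetical widgets-static.com), which is the setup the PageSpeed rule has in mind.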

  • How can I approach creating an efficient algorithm for maximizing value with these specific constraints?

    - by sway
    I'm having trouble coming up with an approach that isn't n^2 for this problem. Here's a contrived, simplified version I've come up with: let's say you're a company that needs 4 employees to launch in a new city (a manager, two salespeople, and a customer support rep), and you magically know how much impact every candidate will have and how much salary they require to take the job. Your table of potential employees looks something like this:

        Name           Position       Salary    Impact
        Adam Smith     Manager        60,000    11
        Allison Brown  Salesperson    40,000    9
        Brad Stewart   Manager        55,000    9
        ...etc (thousands of records)

    What algorithmic approach can be taken to find the maximum "impact" while still filling all the positions and remaining under, say, a 200,000 budget? Thanks!

    Read the article
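
    One way to avoid comparing every candidate against every other is to treat each position as its own small knapsack (choose exactly the required number of hires from that position's pool) and then combine the positions over the shared budget. A rough sketch of that idea; the function name and data layout are illustrative, and it assumes each candidate applies for a single position:

        from collections import defaultdict

        def best_team(candidates, required, budget):
            """candidates: iterable of (name, position, salary, impact) tuples
            required:   dict mapping position -> number of hires needed
            budget:     total salary budget
            Returns (best_total_impact, hired_names), or (None, []) if infeasible."""
            by_pos = defaultdict(list)
            for name, pos, salary, impact in candidates:
                by_pos[pos].append((name, salary, impact))

            profile = {0: (0, [])}   # total spend -> (best impact, names) for groups merged so far

            for pos, k in required.items():
                # 0/1 knapsack within the group, with a count constraint so nobody is hired twice:
                # (hired, spent) -> (best impact, names)
                states = {(0, 0): (0, [])}
                for name, salary, impact in by_pos[pos]:
                    for (hired, spent), (imp, names) in list(states.items()):
                        if hired == k or spent + salary > budget:
                            continue
                        key = (hired + 1, spent + salary)
                        if key not in states or imp + impact > states[key][0]:
                            states[key] = (imp + impact, names + [name])
                group = {spent: v for (hired, spent), v in states.items() if hired == k}

                # merge this position's spending options into the running profile
                merged = {}
                for b1, (i1, n1) in profile.items():
                    for b2, (i2, n2) in group.items():
                        b = b1 + b2
                        if b <= budget and (b not in merged or i1 + i2 > merged[b][0]):
                            merged[b] = (i1 + i2, n1 + n2)
                profile = merged
                if not profile:
                    return None, []

            return max(profile.values(), key=lambda v: v[0])

    Called with the three rows from the table above and required = {"Manager": 1, "Salesperson": 2, "Support": 1}, it returns (None, []) because the sample can't fill every slot; on a full candidate list it returns the best achievable impact and the chosen names. Rounding salaries to, say, the nearest thousand bounds the number of distinct spend values by the budget rather than by the number of candidates, which is what keeps the whole approach out of n^2 territory.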

  • Make blogger load faster

    - by Wladimir Ivanov
    Hello all. I use Blogger as the platform for an electronic music blog. Because of the blog's subject matter, I embed many iframes (YouTube & SoundCloud). Of course this makes the articles load slowly. Almost every article on this blog consists of some text and many iframes below it. What should I do in this particular case to make the articles (pages) load faster? Is there an off-the-shelf solution, or should I use something like a jQuery lazy-load plugin to load the iframes once the scroll position reaches them? Any help is greatly appreciated.

    Read the article

  • Parallel downloading of JavaScript files on page load

    - by user359650
    Below is a quote from one of the Yahoo performance pages: "While a script is downloading, however, the browser won't start any other downloads, even on different hostnames." When I look at the page load of our website, I can see that many scripts are being downloaded at the same time. Am I mistaken, or should the quote instead read like this? "While scripts are downloading (there can be several scripts downloading at the same time), the browser won't start any other downloads, even on different hostnames."

    Read the article

  • Loading main javascript on every page? Or breaking it up to relevant pages?

    - by Kyle
    I have a 700KB decompressed JS file which is loaded on every page. Before, I had 12 JavaScript files on each page, but to reduce HTTP requests I combined them all into one file. This file is ~130KB gzipped and is served over gzip. However, on the local computer it is still unpacked and loaded on every page. Is this a performance issue? I've profiled the JavaScript with Firebug's profiler but did not see any issues. The problem/illusion I am facing is that there are jQuery libraries bundled in that file that are sometimes not used on the current page. For example, jQuery DataTables is 200KB compressed and is only loaded on 2 of my website's pages. Another is jqPlot, which is another 200KB. I now have 400KB of excess code that isn't executed on 80% of the pages. Should I leave everything in one file? Should I take out the jQuery libraries and load only the relevant JS on the current page?

    Read the article

  • PHP and performance

    - by Naif
    I always hear that PHP is for medium and small websites, whereas .NET and Java are for enterprise applications. My question is about PHP. Why is PHP not a good option for enterprise web applications? Is it because, if the web application becomes bigger, PHP will be slower since it is an interpreted language? I know that the corporate world will choose .NET or J2EE because of the integration with their products and because of back-end services, etc. However, if we just have PHP for building sites and web applications, then how can we use it to perform well with big sites? In short, is there a relationship between the performance of PHP and the size of the website? What are the factors that make PHP an inappropriate option for big sites?

    Read the article

  • Page Load Time - "Waiting on..." taking ages. What part of page request process is hung?

    - by James
    I have a new cluster site running on Magento that's on a development server made up of 2 x web servers and 1 x database server. I have optimized the site in all the areas I know (gzip, increasing PHP memory limits, increasing database memory limits etc.), but sometimes the page load gets stuck on 'Waiting for xxx.xx.xx.xxx' (Chrome and other browsers; Chrome just shows it that way). It can sit there for 40+ seconds, and sometimes it just never loads and I close it in frustration. What part of the page loading process is it hung at? Is it a server issue, a database issue, a platform issue? I need to know where to start, or whether to push the hosting provider about it.

    Read the article
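
    Chrome's 'Waiting for…' phase is time-to-first-byte, i.e. the servers thinking (PHP, MySQL, Magento block rendering) rather than the network moving data. A crude way to separate the two from outside, using the third-party requests library and a placeholder URL:

        import time
        import requests

        url = "https://shop.example.com/some-category"      # placeholder for one of the slow pages

        start = time.perf_counter()
        r = requests.get(url, stream=True)                  # returns as soon as the headers arrive
        waiting = time.perf_counter() - start               # ~ what Chrome calls "Waiting for ..."

        body = b"".join(r.iter_content(chunk_size=64 * 1024))
        total = time.perf_counter() - start

        print(f"waiting (TTFB): {waiting:.2f} s   transfer: {total - waiting:.2f} s   size: {len(body) / 1024:.0f} KB")

    If the waiting part is what blows up to 40+ seconds, the problem is on the web/database servers (slow queries, missing full-page or block caching, PHP hitting its limits) and that is the evidence to take to the hosting provider; if the transfer part dominates, the network path and payload size are the place to look instead.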

  • Recommended hardware for developing ios games?

    - by Matthew
    I know you have to have Mac OS X to use Xcode, and thus to develop/compile apps for the iPhone. I'm not exactly wanting to go the Hackintosh way, so I'm looking at buying a used Mac. What specs are recommended? If I buy a cheap Mac mini that has only 1GB of RAM, would that be enough? (I'm not talking about using it to create the graphics/audio; I'll use my normal Windows/Ubuntu PC for that.) I'm just talking about being able to use Xcode and write applications. I'm trying to spend the least amount I can without running into problems developing the app.

    Read the article

  • Why am I getting domainpark.cgi being called from my website?

    - by Sean
    I used to test my site on www.exampleone.com; I have now moved to the real domain www.realdomain.com, and www.exampleone.com is now parked by 1and1 (the default). When I test to see which requests are made by www.realdomain.com, I see domainpark.cgi and park.js from Sedo Parking also being requested, as well as the JS that serves the ads by adclicks. How do I get rid of this? It's not on the index page at all, and it's causing a lot of strain and slowing my site down.

    Read the article

  • Can I host a high traffic website at home?

    - by eric01
    I've been searching this on Google, but I can't formulate my search with the right terms to find an accurate answer to my question. Is it possible to have a super-fast connection at home to host a high-traffic website? What is the generic term for that kind of connection? What's the major drawback of hosting at home? (I have no idea of the price range, but it's probably quite expensive.) Do you have to be a company to have the right to own such a connection?

    Read the article

  • What is the impact of a CMS on page load time versus a static site?

    - by PleaseStand
    I am creating a 20-page site that will go on shared hosting. Each page will be about 20 KB (including HTML, CSS, and images common to all pages). To avoid manually adding navigation elements to each page, I am considering using a CMS. However, I am concerned that on a busy server, using a CMS would make the site load more slowly. In a shared hosting environment where PHP is run as a CGI binary, how much does a CMS (WordPress, Drupal, etc.) generally affect page load time, compared to both "plain HTML" static sites and those using PHP as merely a templating language?

    Read the article

  • Blogger Blog Takes Ages to Load after Custom Domain Redirection

    - by abhisek
    I recently bought a custom domain for a Blogger blog (technabled.com) that I have had for some time now. I followed the instructions in Blogger's documentation and added A records and CNAME records with my DNS provider. But now some strange problems are cropping up. If I connect to my broadband network and then ping technabled.com, it times out. Then, if I visit the webpage, which takes almost one and a half minutes to load, and then ping technabled.com, it shows the expected result. This is not just me; I asked some of the regular readers, who reported the same issue. As a result of this, I am losing a lot of visits. What is stranger is that subsequent visits to the blog are faster. I have checked with a few online services to test the performance. WebPageTest seems to say the same thing: http://www.webpagetest.org/result/110117_1N_7PE/ (please see the First View / Repeat View times). Also, the PageSpeed score is not that bad, so I am ruling out other possibilities. I am at a loss as to what I should do to find a solution. Help is much appreciated. :)

    Read the article

  • Windows gets progressively slower over time, why doesn't Ubuntu?

    - by William
    I, and many other former Windows users, notice that the computer seems to get progressively slower over time. I bought a LeapFrog Crammer only to find it installed a process that sat there waiting for me to plug the Crammer in so it could run its software. It took up three percent of the CPU twenty-four hours a day, seven days a week! This is one of the main reasons I left Windows. But Ubuntu doesn't seem to slow down over time at all. Does Ubuntu allow programs to install background processes like the LeapFrog Crammer did, to sit there like a leech and suck away resources? Could someone explain why Windows tends to get slower over time, and whether Ubuntu is vulnerable to this too? Thanks for any help; this is puzzling me.

    Read the article
