Search Results

Search found 61110 results on 2445 pages for 'generation time'.


  • C++ quick sort running time

    - by chnet
    I have a question about the quick sort algorithm. I implemented quick sort and experimented with it. The elements of the initial unsorted array are random numbers chosen from a certain range, and I find that the range of the random numbers affects the running time. For example, sorting 1,000,000 random numbers chosen from the range (1 - 2000) takes 40 seconds, while it takes 9 seconds if the 1,000,000 numbers are chosen from the range (1 - 10,000). I do not know how to explain this. In class we discussed how the pivot value can affect the depth of the recursion tree. In my implementation the last value of the array is chosen as the pivot value; I do not use a randomized scheme to select it.

        int partition(vector<int> &vec, int p, int r) {
            int x = vec[r];
            int i = (p-1);
            int j = p;
            while(1) {
                if (vec[j] <= x) {
                    i = (i+1);
                    int temp = vec[j];
                    vec[j] = vec[i];
                    vec[i] = temp;
                }
                j = j+1;
                if (j == r) break;
            }
            int temp = vec[i+1];
            vec[i+1] = vec[r];
            vec[r] = temp;
            return i+1;
        }

        void quicksort(vector<int> &vec, int p, int r) {
            if (p < r) {
                int q = partition(vec, p, r);
                quicksort(vec, p, q-1);
                quicksort(vec, q+1, r);
            }
        }

        void random_generator(int num, int *array) {
            srand((unsigned)time(0));
            int random_integer;
            for (int index = 0; index < num; index++) {
                random_integer = (rand() % 10000) + 1;
                *(array+index) = random_integer;
            }
        }

        int main() {
            int array_size = 1000000;
            int input_array[array_size];
            random_generator(array_size, input_array);
            vector<int> vec(input_array, input_array + array_size);
            clock_t t1, t2;
            t1 = clock();
            quicksort(vec, 0, (array_size - 1)); // call quick sort
            int length = vec.size();
            t2 = clock();
            float diff = ((float)t2 - (float)t1);
            cout << diff << endl;
            cout << diff / CLOCKS_PER_SEC << endl;
        }
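
    Editor's note: the usual explanation here is duplicate keys rather than the pivot position alone. A Lomuto-style partition like the one above sends every element equal to the pivot to the same side, so with 1,000,000 values drawn from only 2,000 distinct keys (each value repeated roughly 500 times) the splits become badly unbalanced and the sort drifts toward quadratic behaviour; the 1 - 10,000 range produces fewer repeats per key and therefore runs faster. (As a side note, int input_array[array_size] is a variable-length array of a million ints on the stack, which is not standard C++ and can overflow the stack; a std::vector would be safer.) A common fix is three-way ("Dutch national flag") partitioning, sketched below in Python purely to illustrate the idea; the same restructuring applies to the C++ code.

        def quicksort3(a, lo=0, hi=None):
            """Quicksort with three-way partitioning: keys equal to the pivot are
            grouped in the middle and excluded from both recursive calls, so heavy
            duplication no longer degrades toward O(n^2)."""
            if hi is None:
                hi = len(a) - 1
            if lo >= hi:
                return
            pivot = a[hi]
            lt, i, gt = lo, lo, hi
            while i <= gt:
                if a[i] < pivot:
                    a[lt], a[i] = a[i], a[lt]
                    lt += 1
                    i += 1
                elif a[i] > pivot:
                    a[gt], a[i] = a[i], a[gt]
                    gt -= 1
                else:
                    i += 1
            quicksort3(a, lo, lt - 1)   # strictly less than the pivot
            quicksort3(a, gt + 1, hi)   # strictly greater than the pivot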

    Read the article

  • How to get current date and time from DB using SQLAlchemy

    - by bluish
    I need to retrieve the current date and time of the database I'm connected to with SQLAlchemy (not the date and time of the machine where I'm running the Python code). I've seen these functions, but they don't seem to do what they say:

        >>> from sqlalchemy import *
        >>> print func.current_date()
        CURRENT_DATE
        >>> print func.current_timestamp()
        CURRENT_TIMESTAMP

    Moreover, it seems they don't need to be bound to any SQLAlchemy session or engine. It makes no sense... Thanks!
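
    Editor's note: those calls only build SQL expression objects; nothing is sent to the database until the expression is executed on a connection, which is why no session or engine is involved at that point. A minimal sketch (SQLAlchemy 1.4+ style, hypothetical connection URL):

        from sqlalchemy import create_engine, func, select

        engine = create_engine("postgresql://user:password@localhost/mydb")  # hypothetical URL
        with engine.connect() as conn:
            # The expression is compiled and evaluated by the database server,
            # so this is the database's clock, not the local machine's.
            db_now = conn.execute(select(func.current_timestamp())).scalar()
            print(db_now)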

    Read the article

  • Group MySQL Data into Arbitrarily Sized Time Buckets

    - by Eric J.
    How do I count the number of records in a MySQL table based on a timestamp column per unit of time, where the unit of time is arbitrary? Specifically, I want to count how many records' timestamps fell into 15-minute buckets during a given interval. I understand how to do this in buckets of 1 second, 1 minute, 1 hour, 1 day, etc. using MySQL date functions, e.g.

        SELECT YEAR(datefield) Y, MONTH(datefield) M, DAY(datefield) D, COUNT(*) Cnt
        FROM mytable
        GROUP BY YEAR(datefield), MONTH(datefield), DAY(datefield)

    but how can I group by 15-minute buckets?
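
    Editor's note: a common approach is to group on the timestamp truncated to a 900-second (15-minute) boundary, e.g. GROUP BY FLOOR(UNIX_TIMESTAMP(datefield) / 900); changing 900 to another number of seconds gives arbitrary bucket sizes. A hedged sketch, driven from Python via SQLAlchemy here (hypothetical connection URL; table and column names taken from the question):

        from sqlalchemy import create_engine, text

        engine = create_engine("mysql+pymysql://user:password@localhost/mydb")  # hypothetical URL
        bucketed = text("""
            SELECT FROM_UNIXTIME(FLOOR(UNIX_TIMESTAMP(datefield) / 900) * 900) AS bucket_start,
                   COUNT(*) AS cnt
            FROM mytable
            GROUP BY bucket_start
            ORDER BY bucket_start
        """)
        with engine.connect() as conn:
            for bucket_start, cnt in conn.execute(bucketed):
                print(bucket_start, cnt)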

    Read the article

  • Data not displayed first time in Android after copying the file from assets to data folder

    - by Thinkcomplete
    Description: I have used the code tip from http://www.reigndesign.com/blog/using-your-own-sqlite-database-in-android-applications/ to copy a pre-filled data file to the target and handled this in an async task. Problem: on starting the application it gives an error and shuts down the first time; starting it again without any change, it works perfectly fine. So the first time, after the file is copied, the error appears, but after that there are no issues. Please help. Code attached:

        private class CopyDatabase extends AsyncTask<String, Void, Boolean> {
            private final ProgressDialog dialog = new ProgressDialog(BabyNames.this);

            protected void onPreExecute() {
                this.dialog.setMessage("Loading...");
                this.dialog.show();
            }

            @Override
            protected Boolean doInBackground(String... params) {
                // TODO Auto-generated method stub
                try {
                    namesDBSQLHelper.createDatabase();
                    return null;
                } catch(IOException ioe){
                    ioe.printStackTrace();
                }
                return null;
            }

            protected void onPostExecute(final Boolean success){
                if (this.dialog.isShowing()){
                    this.dialog.dismiss();
                }
            }
        }

    Read the article

  • Store time of the day in SQL

    - by nute
    How would you store a time or time range in SQL? It won't be a datetime, because it will just be, let's say, 4:30 PM (not January 3rd, 4:30 PM). These would be weekly or daily meetings. The queries I need are of course for display, but later they will also include more complex queries such as avoiding schedule conflicts, so I'd rather pick the best datatype for that now. I'm using MS SQL Server Express 2005. Thanks! Nathan
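
    Editor's note: SQL Server 2005 has no dedicated TIME type (it was added in SQL Server 2008), so the usual workarounds are either a datetime/smalldatetime with a fixed dummy date, or an integer number of minutes after midnight, which keeps range and overlap checks as plain arithmetic. A small sketch of the integer approach on the application side (Python, purely illustrative):

        from datetime import datetime

        def time_to_minutes(text: str) -> int:
            """'4:30 PM' -> 990 (minutes after midnight)."""
            t = datetime.strptime(text, "%I:%M %p")
            return t.hour * 60 + t.minute

        def minutes_to_text(minutes: int) -> str:
            return "{:02d}:{:02d}".format((minutes // 60) % 24, minutes % 60)

        print(time_to_minutes("4:30 PM"))   # 990
        print(minutes_to_text(990))         # 16:30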

    Read the article

  • Time complexity O() of isPalindrome()

    - by Aran
    I have this method, isPalindrome(), and I am trying to find its time complexity, and also to rewrite the code more efficiently.

        boolean isPalindrome(String s) {
            boolean bP = true;
            for(int i=0; i<s.length(); i++) {
                if(s.charAt(i) != s.charAt(s.length()-i-1)) {
                    bP = false;
                }
            }
            return bP;
        }

    Now I know this code compares each character of the string with the corresponding character counted from the end, and if they match it doesn't change bP. And I think the operations are s.length(), s.charAt(i) and s.charAt(s.length()-i-1), making the time complexity O(N + 3), I think? Is this correct? If not, what is it, and how is that figured out? Also, to make this more efficient, would it be good to store the characters in temporary strings?
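
    Editor's note: the loop does a constant amount of work per character (length() and charAt() are O(1) on a Java String), so the running time is O(n) for a string of length n; constant terms like "+ 3" are dropped in big-O notation. Asymptotically you cannot beat O(n), but you can halve the comparisons and stop at the first mismatch by walking two indices inwards, and no temporary strings are needed. The same idea, sketched in Python for brevity:

        def is_palindrome(s: str) -> bool:
            i, j = 0, len(s) - 1
            while i < j:
                if s[i] != s[j]:
                    return False    # early exit on the first mismatch
                i += 1
                j -= 1
            return True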

    Read the article

  • User control attributes at design-time

    - by ciscoheat
    I'm testing a simple User Control in Visual Studio 2008: a Panel named Wrapper with some controls inside. Can Visual Studio handle this at design time?

        public partial class TestControl : System.Web.UI.UserControl
        {
            [Description("Css class of the div around the control.")]
            [CssClassProperty]
            public string CssClass
            {
                get { return Wrapper.CssClass; }
                set { Wrapper.CssClass = value; }
            }
        }

    When setting the CssClass property, it doesn't update the CSS of the Panel at design time. Am I hoping for too much?

    Read the article

  • Time Complexities of recursive algorithms

    - by Peter
    Whenever I see a recursive solution, or write recursive code for a problem, it is really difficult for me to figure out the time complexity; in most cases I just say it's exponential. How is it actually exponential? How do people tell whether it is 2^n, n!, n^n or n^k? I have some examples in mind: find all permutations of a string (O(n!)); find all sequences in an array which sum up to k (exponential, but how exactly do I calculate it?); find all subsets of size k whose sum is 0 (will k come up somewhere in the complexity? It should, right?). Can anyone help me calculate the exact complexity of such problems? I am able to write code for them, but it's hard to understand the exact time complexity.
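
    Editor's note: the usual route is to write down the recurrence the recursion tree satisfies and count its nodes or leaves. Standard sketches for the three examples, stated informally and assuming each solution is emitted once:

        T_{\mathrm{perm}}(n) = n\,T_{\mathrm{perm}}(n-1) + O(1) \;\Rightarrow\; n! \text{ leaves, i.e. } O(n \cdot n!) \text{ if each length-}n\text{ permutation is printed}

        T_{\mathrm{sum}}(n) = 2\,T_{\mathrm{sum}}(n-1) + O(1) \;\Rightarrow\; O(2^n) \quad \text{(each element is either included or excluded)}

        \binom{n}{k} \le n^k \;\Rightarrow\; O(n^k) \text{ candidate subsets of size } k \text{, so } k \text{ does appear in the bound for fixed } k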

    Read the article

  • MySQL Split Time Ranges into Smaller Chunks

    - by Neren
    Hello all, I've recently been tasked with finishing a PHP/MySQL web app after the developer quit last week. I'm no MySQL expert, so I apologize if this is an intensely simple question; I've searched SO for the better part of two days trying to find a relatively easy solution to my problem, which is as follows. Problem in a nutshell: I have a MySQL table full of start and end datetime (GMT -5) and UNIX timestamp values covering durations of irregular length, and I need to break/split/divide them into more regular time chunks (5 minutes). I'm not after a count of row entries per time chunk/bucket/period, if that makes any sense. Data example:

        started, ended, started_UNIX, ended_UNIX
        2010-10-25 15:12:33, 2010-10-25 15:47:09, 1288033953, 1288036029

    What I'm hoping to get:

        2010-10-25 15:12:33, 2010-10-25 15:15:00, 1288033953, 1288037700
        2010-10-25 15:15:00, 2010-10-25 15:20:00, 1288037700, 1288038000
        2010-10-25 15:20:00, 2010-10-25 15:25:00, 1288038000, 1288038300
        2010-10-25 15:25:00, 2010-10-25 15:30:00, 1288038300, 1288038600
        2010-10-25 15:30:00, 2010-10-25 15:35:00, 1288038600, 1288038900
        2010-10-25 15:35:00, 2010-10-25 15:40:00, 1288038900, 1288039200
        2010-10-25 15:40:00, 2010-10-25 15:45:00, 1288039200, 1288039500
        2010-10-25 15:45:00, 2010-10-25 15:47:09, 1288039500, 1288039629

    If you're interested, here's the quick and dirty on the app and why I need the data. App overview: the application receives very simple POST requests generated by a basic sensor device when its input pins go to ground, which submits an INSERT query to the database where MySQL records a timestamp (as started). When the input pins return from a grounded state, the device submits a different POST request, which causes the PHP app to submit an UPDATE query, where a modification time timestamp is inserted (as ended). My employer recently changed the periodic reporting unit of measure from Seconds "On" Per Day to Seconds "On" Per 5 Minute Interval. I had formulated what I thought would be a workable solution, but when I looked at it on paper it looked like Rube Goldberg's nightmare constructed in MySQL, so that was out. Any suggestions as to how to break these spans into 5-minute blocks? Keeping it all in MySQL would be my preference, though I'll take any suggestions. Thank you for any suggestions you may have. Again, I apologize if this is a no-brainer. If I ask any additional questions of the SO collective consciousness in the future, I'll try to word them a bit better. Any help will be happily welcomed. Thanks, Neren
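
    Editor's note: since the poster will take non-MySQL suggestions, one straightforward option is to split each row into 5-minute pieces in the application layer and write the pieces back (or aggregate them on the fly); the same cut points can also be produced in pure SQL by joining against a helper table of 5-minute boundaries, but that tends to become the "Rube Goldberg" query mentioned above. A minimal Python sketch of the splitting logic, assuming naive local datetimes:

        from datetime import datetime, timedelta

        def split_into_chunks(start, end, minutes=5):
            """Yield (chunk_start, chunk_end) pairs, cut on `minutes`-aligned boundaries."""
            step = timedelta(minutes=minutes)
            floored = start.replace(second=0, microsecond=0)
            floored -= timedelta(minutes=floored.minute % minutes)
            boundary = floored + step          # first aligned boundary after `start`
            current = start
            while boundary < end:
                yield current, boundary
                current = boundary
                boundary += step
            yield current, end

        started = datetime(2010, 10, 25, 15, 12, 33)
        ended = datetime(2010, 10, 25, 15, 47, 9)
        for chunk_start, chunk_end in split_into_chunks(started, ended):
            print(chunk_start, chunk_end)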

    Read the article

  • Rails: How do I get created_at to show the time in my current time zone?

    - by Schneems
    It seems that when I create an object, the time is not correct. You can see this in the script/console output below. Has anyone encountered anything like this, or does anyone have any debugging tips?

        >> Ticket.create(...)
        => #<Ticket id: 7, from_email: "[email protected]", ticket_collaterals: nil, to_email: "[email protected]", body: "hello", subject: "testing", status: nil, whymail_id: nil, created_at: "2009-12-31 04:23:20", updated_at: "2009-12-31 04:23:20", forms_id: nil, body_hash: nil>
        >> Ticket.last.created_at.to_s(:long)
        => "December 31, 2009 04:23"
        >> Time.now.to_s(:long)
        => "December 30, 2009 22:24"

    Read the article

  • C++ thread running time

    - by chnet
    I want to know whether I can calculate the running time of each thread. I implemented a multithreaded program in C++ using pthreads. As we know, the threads compete for the CPU. Can I use the clock() function to calculate the actual number of CPU clocks each thread consumes? My program looks like:

        Class Thread () {
            Start();
            Run();
            Computing();
        };

    Start() starts multiple threads, and each thread then runs the Computing function to do some work. My question is how I can calculate the running time of each thread for the Computing function.
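
    Editor's note: clock() will not separate the threads; on POSIX systems it reports CPU time for the whole process, i.e. all threads combined. Per-thread CPU time needs a per-thread clock, e.g. clock_gettime() with CLOCK_THREAD_CPUTIME_ID (or pthread_getcpuclockid()) in C/C++. Python exposes the same facility as time.thread_time(), which the sketch below uses purely to illustrate the measurement pattern:

        import threading
        import time

        def computing(n):
            start = time.thread_time()            # CPU time consumed by *this* thread only
            total = sum(i * i for i in range(n))  # stand-in for the real Computing() work
            used = time.thread_time() - start
            print(f"{threading.current_thread().name}: {used:.3f} s of CPU time")

        threads = [threading.Thread(target=computing, args=(2_000_000,)) for _ in range(4)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()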

    Read the article

  • Using system time directly to get random numbers

    - by Richard Mar.
    I had to return a random element from an array, so I came up with this placeholder:

        return codes[(int) (System.currentTimeMillis() % codes.length - 1)];

    Now that I think of it, I'm tempted to use it in real code. The Random() seeder uses the system time as a seed in most languages anyway, so why not use that time directly? As a bonus, I'm free from the worry of non-random lower bits in many RNGs. Is this hack going to come back to bite me? (The language is Java, if that's relevant.)
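
    Editor's note: two separate problems lurk here. First, % binds tighter than the trailing - 1, so whenever currentTimeMillis() % codes.length is 0 the index becomes -1 and the lookup throws. Second, consecutive calls land in the same or adjacent milliseconds, so the "random" picks are strongly correlated rather than uniform; a clock value makes a reasonable seed but a poor random stream. The usual fix in Java is one shared java.util.Random (or ThreadLocalRandom) instance. A small Python sketch, shown only to make the correlation visible:

        import random
        import time

        codes = list("ABCDEFGHIJ")

        # Millisecond-clock "randomness": a tight loop sees the same millisecond
        # over and over, so the picks cluster on one or two indices.
        clock_picks = [codes[int(time.time() * 1000) % len(codes)] for _ in range(10)]
        print(clock_picks)

        # A single RNG instance gives independent, roughly uniform picks.
        rng = random.Random()
        print([rng.choice(codes) for _ in range(10)])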

    Read the article

  • long waiting time in linking

    - by ccanan
    Hi, here is the situation. I am using Visual Studio 2005. The solution contains lots of projects, 34 in all, and the startup project depends on the others. In the linking phase it waits a long time before the real linking starts. I am pretty sure it's because of the number of dependent projects, because when I use a solution with only 10 of the 34 projects (keeping the other projects as headers and libs), linking starts instantly. Does anyone have an idea how I can reduce the waiting time? Thanks.

    Read the article

  • Oracle Insurance Unveils Next Generation of Enterprise Document Automation: Oracle Documaker Enterprise Edition

    - by helen.pitts(at)oracle.com
    Oracle today announced the introduction of Oracle Documaker Enterprise Edition, the next generation of the company's market-leading Enterprise Document Automation (EDA) solution for dynamically creating, managing and delivering adaptive enterprise communications across multiple channels. "Insurers and other organizations need enterprise document automation that puts the power to manage the complete document lifecycle in the hands of the business user," said Srini Venkatasanthanam, vice president, Product Strategy, Oracle Insurance, in the press release. "Built with features such as rules-based configurability and interactive processing, Oracle Documaker Enterprise Edition makes possible an adaptive approach to enterprise document automation - documents when, where and in the form they're needed."

    Key enhancements in Oracle Documaker Enterprise Edition include: Documaker Interactive, the newly renamed and redesigned Web-based iDocumaker module. Documaker Interactive enables users to quickly and interactively create and assemble compliant communications such as policy and claims correspondence directly from their desktops. Users benefit from built-in accelerators and rules-based configurability, pre-configured content, as well as embedded workflow leveraging Oracle BPEL Process Manager. Documaker Factory, which helps enterprises reduce cost and improve operational efficiency through better management of their enterprise publishing operations. Dashboards, analytics, reporting and an administrative console provide insurers with greater insight and centralized control over document production, allowing them to better adapt their resources based on business demands. Other enhancements include: enhanced business user empowerment; additional multi-language localization capabilities; and benefits from the use of powerful Oracle technologies such as the Oracle Application Development Framework for all interfaces and Oracle Universal Content Management (Oracle UCM) for enterprise content management.

    Drive Competitive Advantage and Growth: Deb Smallwood, founder of SMA (Strategy Meets Action), a leading insurance industry analyst consulting firm and co-author of 3CM in Insurance: Customer Communications and Content Management, published last month, noted in the press release that "maximum value can be gained from investments when Enterprise Document Automation (EDA) is viewed holistically and all forms of communication and all types of information are integrated across the entire enterprise." "Insurers that choose an approach that takes all communications, both structured and unstructured data, coming into the company from a wide range of channels, and then create seamless flows of information will have a real competitive advantage," Smallwood said. "This capability will soon become essential for selling, servicing, and ultimately driving growth through new business and retention."

    Learn More: Click here to watch a short Flash demo that demonstrates the real business value offered by Oracle Documaker Enterprise Edition. You can also see how an insurance company can use Oracle Documaker Enterprise Edition to dynamically create, manage and publish adaptive enterprise content throughout the insurance business lifecycle for delivery across multiple channels by visiting Alamere Insurance, a fictional model insurance company created by Oracle to showcase how Oracle applications can be leveraged within the insurance enterprise.

    Meet Our Newest Oracle Insurance Blogger: I'm pleased to introduce our newest Oracle Insurance blogger, Susanne Hale. Susanne, who manages product marketing for Oracle Insurance EDA solutions, will be sharing insights about this topic along with examples of how our customers are transforming their enterprise communications using Oracle Documaker Enterprise Edition in future Oracle Insurance blog entries. Helen Pitts is senior product marketing manager for Oracle Insurance.

    Read the article

  • BizTalk 2009 - The Community ODBC Adapter: Schema Generation with Input Parameters

    - by Stuart Brierley
    As previously noted in my post on Schema Generation using the Community ODBC Adapter, I ran into a problem when trying to generate a schema to represent a MySQL stored procedure that had input parameters. After a bit of investigation and a few dead ends I managed to figure out a way around this issue - detailed below are both the problem and the solution in case you ever run into this yourself.

    The Problem: Imagine a stored procedure that is coded as follows in MySQL:

        StuTest(in DStr varchar(80))
        BEGIN
          Declare GRNID int;
          Select grn_id into GRNID from grn_header where distribution_number = DStr;
          Select GRNID;
        END

    This is quite a simple stored procedure, but it can be used to illustrate the issue with parameters quite nicely. When generating the schema using the Add Generated Items wizard, I tried selecting "Stored Procedure" and then typing the stored procedure name, StuTest, in the Statement Information window. Pressing Generate then gives the following error: "Incorrect Number of arguments for Procedure StuTest; expected 1, got 0". If you attempt to supply a value for the parameter, you end up with a schema that will only ever supply the parameter value that you specify. For example, supplying StuTest('123') will always call the procedure with a parameter value of 123.

    The Solution: I tried contacting Two Connect about this, but their experience of testing the adapter with MySQL was limited. After looking through the code for the ODBC adapter myself and trying a few things out, I was eventually able to use the ODBC adapter to call a test stored procedure using a two-way send port. In the generate schema wizard, instead of selecting Stored Procedure I had to choose SQL Script, detailing the following script:

        Call StuTest(@InputParameter)

    By default this would create a request schema with an attribute called InputParameter, with a SQL type of NVarChar(1). In most cases this is not going to be correct for the stored procedure being called. To change the type from the default, you need to select the "Override default query processing" check box when specifying the script in the wizard. This opens the BizTalk ODBC Override window, which lets you change the properties of the parameters and also test out the query script. Once I had done this I was able to generate the correct schema, which included an attribute representing the parameter. By deploying the schema assembly I was then able to try the ODBC adapter out on a two-way send port. When supplied with an appropriate message instance (for the generated request schema), this send port successfully returned the expected response.

    Read the article

  • ODI 11g - Dynamic and Flexible Code Generation

    - by David Allan
    ODI supports conditional branching at execution time in its code generation framework. This is a little used, little known, but very powerful capability - it lets one piece of template code behave dynamically based on a runtime variable's value, for example. Generally, knowledge modules are free of any variable dependency. Using variables within a knowledge module for this kind of dynamic capability is a valid use case - definitely in the highly specialized area. The example I will illustrate is much simpler - how to define a filter (based on a mapping here) that may or may not be included depending on whether a certain value is defined for a variable at runtime. I define a variable V_COND; if I set this variable's value to 1, then I will include the filter condition 'EMP.SAL > 1', otherwise I will just use '1=1' as the filter condition. I use ODI's substitution tags, using a special tag '<$' which is processed just prior to execution in the runtime code - so this code is included in the ODI scenario code and it is processed after variables are substituted (unlike the '<?' tag). So the lines below are not equal ...

        <$ if ( "#V_COND".equals("1")  ) { $> EMP.SAL > 1 <$ } else { $> 1 = 1 <$ } $>

        <? if ( "#V_COND".equals("1")  ) { ?> EMP.SAL > 1 <? } else { ?> 1 = 1 <? } ?>

    When the '<?' code is evaluated, it is executed without variable substitution - so we do not get the desired semantics; we must use the '<$' code. You can see that the Jython (Java) code in red is the conditional if statement that drives whether 'EMP.SAL > 1' or '1=1' is included in the generated code. For this illustration you need at least the ODI 11.1.1.6 release - with the vanilla 11.1.1.5 release it didn't work for me (maybe patches?). As I mentioned, normally KMs don't have dependencies on variables - since any user must then have these variables defined, etc. - but it does afford a lot of runtime flexibility if such capabilities are required - something to keep in mind, definitely.

    Read the article

  • Response time increasing (worsening) over time with consistent load

    - by NJ
    Ok, I know I don't have a lot of information. That is, essentially, the reason for my question. I am building a game using Flash/Flex and Rails on the back-end. Communication between the two is via WebORB. Here is what is happening: when I start the client, an operation calls the server every 60 seconds (not much, right?), which results in two database SELECTs and an UPDATE and a resulting response to the client. This repeats every 60 seconds. I deployed a test version on Heroku and New Relic's RPM told me that response time degraded over time. One client with one task every 60 seconds. Over several hours the response time drifted from 150 ms to over 900 ms. I have been able to reproduce this in my development environment (my MacBook Pro), so it isn't a problem on Heroku's side. I am not doing anything sophisticated (by design) in the server app. An action gets called, gets some data from the database, performs an AR update and then returns a response. No caching, etc. Any thoughts? Anyone? I'd really appreciate it.

    Read the article

  • Next Generation Mobile Clients for Oracle Applications & the role of Oracle Fusion Middleware

    - by Manish Palaparthy
    Oracle Enterprise Applications have been available with modern web browser based interfaces for a while now. The web browsers available in smartphones no longer require a special markup language such as WML, since the processing power of these handsets is quite near to that of a typical personal computer. Modern mobile devices such as the iPhone, Android phones, BlackBerry and Windows 8 devices can now render XHTML and HTML quite well. This means you could potentially use your mobile browser to access your favorite enterprise application. While the mobile browser would render the UI, you might find it difficult to use due to the formatting and presentation of the native UI. Smartphones offer a lot more than just a powerful web browser; they offer capabilities such as maps, GPS, multi-touch, pinch zoom, accelerometers, vivid colors, cameras with video, support for 3G/4G networks, cloud storage, NFC, streaming media, tethering, voice-based features, multitasking, messaging, social networking and web browsers with support for HTML5, and many more. While the full potential of enterprise mobile apps is yet to be realized, Oracle has published a few of its applications that take advantage of the above capabilities and are available for the iPhone natively. Here are some of them.

    iPhone apps: Oracle Business Approvals for Managers: offers a highly intuitive user interface built as a native mobile application to conveniently access pending actions related to expenses, purchase requisitions, HR vacancies and job offers. You can even view BI reports related to the worklist actions. Works with Oracle E-Business Suite. Oracle Business Indicators: real-time, secure access to OBI reports. Oracle Business Approvals for Sales Managers: enables sales executives to review key targeted tasks and access relevant business intelligence reports. Works with Siebel CRM, Siebel Quote & Order Capture. Oracle Mobile Sales Assistant: a CRM application that provides real-time, secure access to the information your sales organization needs, lets you complete frequent tasks, and helps you collaborate with colleagues and customers. Works with Oracle CRM. Oracle Mobile Sales Forecast: designed specifically for the mobile business user to view key opportunities. Works with Oracle CRM On Demand. Oracle iReceipts: part of Oracle PeopleSoft Expenses, which allows users to create and submit expense lines for cash transactions in real time. Works with Oracle PeopleSoft Expenses.

    Now that we have seen some mobile apps that Oracle has published, I am sure you are intrigued as to how to develop your own clients for the use cases that you deem most fit. For that, Oracle has ADF Mobile.

    ADF Mobile: You could develop mobile applications with the SDKs available for the smartphone platforms, but you'd really have to be a mobile ninja developer to build apps with a rich user experience like the ones above. The challenges really multiply when you have to support multiple mobile devices. The ADF Mobile framework is really handy for meeting this challenge. ADF Mobile can be used to:

    Develop apps for the mobile browser: an application built with the ADF Mobile framework installs on a smart device, renders its user interface via HTML5, and has access to device services. This means the programming model is primarily web-based, which offers consistency with other enterprise applications as well as easier migration to new platforms.

    Develop apps for the mobile client (native apps): these applications have access to device services, enabling a richer experience for users than a browser alone can offer. ADF Mobile enables rapid and declarative development of rich, on-device mobile applications. Developers only need to write an application once and then they can deploy the same application across multiple leading smartphone platforms.

    Oracle SOA Suite: although the mobile users are using the smartphone apps, and the actual transactions are being executed in the underlying app, there is a lot of technical wizardry going on under the surface. The key technical pieces - making web service calls, authentication, intercepting web service calls to add security credentials to the request, invoking the services of the enterprise application, and integrating with the enterprise application via the adapter - are all implemented at the SOA infrastructure layer. As you can see from the diagram in the original post, the key prerequisites to mobile-enable an enterprise application are: the core enterprise application, Oracle SOA Suite, and ADF Mobile.

    Read the article

  • C# XNA: Efficient mesh building algorithm for voxel-based terrain ("top" outside layer only, non-destructible)

    - by Tim Hatch
    To put this bluntly, for non-destructible/non-constructible voxel-style terrain, are generated meshes handled much better than instancing? Is there another method to achieve millions of visible quad faces per scene with ease? If generated meshes per chunk is the way to go, what kind of algorithm might I want to use based on only EVER needing the outer layer rendered? I'm using 3D Perlin noise for terrain generation (for overhangs/caves/etc). The layout is fantastic, but even for around 20k visible faces, it's quite slow using instancing (whether it's one big draw call or multiple smaller chunks). I've simplified it to the point of removing non-visible cubes and only having the top faces of my cube-like terrain be rendered, but with 20k quad instances it's still pretty sluggish (30 fps on my machine). My goal is for the world to be made using quite small cubes. Where multiple games (e.g. Minecraft) have the player 1x1 cube in width/length and 2 high, I'm shooting for 6x6 width/length and 9 high. With a lot of advantages as far as gameplay goes, it also means I could quite easily have a single scene with millions of truly visible quads. So I have been trying to look into changing my method from instancing to mesh generation on a chunk-by-chunk basis. Do video cards handle this type of processing better than separate quads/cubes through instancing? What kind of existing algorithms should I be looking into? I've seen references to marching cubes a few times now, but I haven't spent much time investigating it since I don't know if it's the better route for my situation or not. I'm also starting to doubt my need for 3D Perlin noise for terrain generation, since I won't want the kind of depth it seems best at. I just like the idea of overhangs and occasional cave-like structures, but could find no better 'surface only' algorithms to cover that. If anyone has any better suggestions there, feel free to throw them at me too. Thanks, Mythics
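
    Editor's note: for static terrain, per-chunk generated meshes usually beat per-cube instancing because the GPU gets a handful of large vertex buffers instead of tens of thousands of tiny instances, and hidden faces can be removed at build time. A common technique is greedy meshing: cull faces buried between solid cells, then merge coplanar neighbouring faces into larger rectangles. A Python sketch of the 2D merging step for top faces only (heights is a hypothetical [x][z] array of column heights; side faces between columns of different height still need their own quads):

        def greedy_mesh_top(heights):
            """Merge adjacent columns of equal height into larger top-face rectangles.
            Returns (x, z, width, depth, height) tuples, one per merged quad."""
            size_x, size_z = len(heights), len(heights[0])
            used = [[False] * size_z for _ in range(size_x)]
            quads = []
            for x in range(size_x):
                for z in range(size_z):
                    if used[x][z]:
                        continue
                    h = heights[x][z]
                    # Grow the quad along z while the height matches.
                    depth = 1
                    while z + depth < size_z and not used[x][z + depth] and heights[x][z + depth] == h:
                        depth += 1
                    # Grow along x while the entire strip of cells matches.
                    width = 1
                    while x + width < size_x and all(
                        not used[x + width][z + k] and heights[x + width][z + k] == h
                        for k in range(depth)
                    ):
                        width += 1
                    for i in range(width):
                        for k in range(depth):
                            used[x + i][z + k] = True
                    quads.append((x, z, width, depth, h))
            return quads

        # Example: a 4x4 patch with a 2x2 bump collapses into a handful of quads.
        print(greedy_mesh_top([[1, 1, 1, 1],
                               [1, 2, 2, 1],
                               [1, 2, 2, 1],
                               [1, 1, 1, 1]]))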

    Read the article

  • Terrible ping time with TP-Link wireless router

    - by rabbid
    I am literally a foot away from this useless TL-WR340G/TL-WR340GD router and check out this ping time:

        64 bytes from 192.168.1.8: icmp_seq=291 ttl=64 time=9477.516 ms
        64 bytes from 192.168.1.8: icmp_seq=292 ttl=64 time=8954.423 ms
        64 bytes from 192.168.1.8: icmp_seq=293 ttl=64 time=8262.836 ms
        64 bytes from 192.168.1.8: icmp_seq=294 ttl=64 time=7937.853 ms
        64 bytes from 192.168.1.8: icmp_seq=295 ttl=64 time=7517.768 ms
        64 bytes from 192.168.1.8: icmp_seq=296 ttl=64 time=7106.063 ms
        64 bytes from 192.168.1.8: icmp_seq=297 ttl=64 time=6492.109 ms
        64 bytes from 192.168.1.8: icmp_seq=298 ttl=64 time=5835.305 ms
        64 bytes from 192.168.1.8: icmp_seq=299 ttl=64 time=5314.897 ms
        64 bytes from 192.168.1.8: icmp_seq=300 ttl=64 time=4902.705 ms
        64 bytes from 192.168.1.8: icmp_seq=301 ttl=64 time=4716.959 ms
        64 bytes from 192.168.1.8: icmp_seq=302 ttl=64 time=5224.450 ms
        64 bytes from 192.168.1.8: icmp_seq=303 ttl=64 time=5024.079 ms
        64 bytes from 192.168.1.8: icmp_seq=304 ttl=64 time=5044.100 ms
        64 bytes from 192.168.1.8: icmp_seq=305 ttl=64 time=4477.990 ms
        64 bytes from 192.168.1.8: icmp_seq=306 ttl=64 time=3582.432 ms
        64 bytes from 192.168.1.8: icmp_seq=307 ttl=64 time=2911.896 ms

    At this time mine is the only computer using the router. This happens from time to time. I'd restart the router, and then it'll have a 1-2 ms ping for a while, and then back to terrible ping. Is it just a poor quality router? Suggestions? Thank you

    Read the article
