Search Results

Search found 23762 results on 951 pages for 'network speed'.


  • System With Two Network Adapters [closed]

    - by Synetech inc.
    Hi, My system has a NIC (Marvell Yukon) built into the motherboard, but I also have a D-Link (RealTek) card. I figure that using the D-Link and disabling the Marvell makes the most sense, though I'm wondering if maybe the built-in one has better throughput (not that my Internet connection is so fast). Also, I'm wondering about the merits of using both at the same time. My router has four ports and I have experimented with enabling both NICs and plugging both into the router. I was able to connect to the Internet, but the pattern of usage seemed irregular (which adapter was chosen for a transfer at any given point). I also considered bridging the two, but am having difficulty finding out what exactly creating a network bridge does in the context of the Windows Network Connections window. I am familiar with the concept of connecting networks, so it seems to me that bridging two connections on the same segment is pointless at best (and can cause problems like loops?). Does anyone have any tips on what to do if a system has more than one NIC, and any clarification on the bridge option? Thanks a lot.

    Read the article

  • Switch between network configurations via command line in fedora 17

    - by Mike Fairhurst
    I have two different setups I use on my work laptop; one enables synergy over an ethernet SSH tunnel with my work computer on the local network, and the other opens an HTTP tunnel to my work computer from outside the network. When I have wifi enabled at work, my laptop seems to use it by preference, which makes synergy run incredibly slowly. At home I must use wifi. I have scripts that start my SSH tunnels, add my SSH keys, start up other programs like synergy, and close themselves when I shut my laptop. However, every day I have to start my routine by opening gnome-control-center and turning on my ethernet. I have tried route add and ifup, but none of it works, so I dove into gnome-control-center's source code and found that it enables the connection via libnm's method nm_client_activate_connection, with some libnm-specific structs that I am having trouble tracking down. I'm not much of a C programmer, and I'm not familiar with either GTK or libnm. Does anybody know what Fedora 17 does with ethernet connections to fully enable them? Or does anybody know what libnm does to fully enable an ethernet connection? Do I have to write a C program that calls libnm for me to fully emulate whatever gnome-control-center is doing?
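
    For context on the command-line route: Fedora 17's NetworkManager ships the nmcli client, and a saved connection can usually be activated by name from a script, which ends up on the same ActivateConnection D-Bus path that libnm's nm_client_activate_connection wraps. A minimal sketch (the connection name "Wired connection 1" is only a placeholder; substitute whatever "nmcli con" lists):

        # Sketch: ask NetworkManager to fully activate a saved connection by name.
        # "Wired connection 1" is a placeholder - run "nmcli con" to see the real names.
        import subprocess

        def activate_connection(name="Wired connection 1"):
            # "nmcli con up id <name>" goes through NetworkManager's ActivateConnection,
            # the same operation gnome-control-center triggers via libnm.
            subprocess.check_call(["nmcli", "con", "up", "id", name])

        if __name__ == "__main__":
            activate_connection()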

    Read the article

  • How to get the speed of a network card on the command line?

    - by nelaar
    I am trying to see what the speed of some network cards is on a remote server. Our reporting software says they are 10Mbps, but I am sure that is wrong; they should be 1Gbps. Our monitoring software uses SNMP to query the servers, so perhaps the servers are reporting the information incorrectly. ifconfig does not report the speed of the devices. How can I see the currently configured speed of the cards?
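
    For context, on a Linux server the negotiated link speed is usually visible either through ethtool or through sysfs; a minimal sketch, assuming a Linux box and an interface name such as eth0 (adjust for the real server):

        # Sketch: report the negotiated link speed of a NIC on Linux.
        # The interface name (eth0) is an assumption - adjust for the real server.
        import subprocess

        def link_speed_mbps(iface="eth0"):
            # /sys/class/net/<iface>/speed holds the speed in Mb/s (e.g. 1000) while the link is up
            with open("/sys/class/net/%s/speed" % iface) as f:
                return int(f.read().strip())

        def ethtool_settings(iface="eth0"):
            # "ethtool eth0" prints a "Speed: 1000Mb/s" line among the other settings
            return subprocess.check_output(["ethtool", iface])

        if __name__ == "__main__":
            print(link_speed_mbps())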

    Read the article

  • Is there a way to change the default animation speed in PowerPoint?

    - by Tim
    I do a lot of PowerPoint (2007) presentations, and some of those involve some rather complex diagrams that I like to build step by step, using PowerPoint's animation feature. I prefer using the "fade in" style, but I find the default Medium speed setting too slow, so every time I add an element to my animation, I have to go to the Speed dropdown box and select "Very fast". This is getting annoying, so I wonder if there is a way to tell PowerPoint that I'd like my standard animation speed (at least for fade-ins) to be "Very fast"?

    Read the article

  • Network bandwidth usage dashboard?

    - by SkippyFlipjack
    I have a couple of wifi access points hooked up to my home network, one of which I keep unsecured for some development I do; there are only a couple of other homes within range and they've got their own wifi, so it's not a big concern. I also have a Sonos system, Tivo, Roku, a couple of laptops, a couple of phones, an iPad and a desktop machine, all of which are internet-smart. So when my internet bandwidth tanks and it takes five minutes to load a YouTube video, I want to know what's going on, and there are many potential culprits. I'd like to be able to plug my MacBook into the primary router and see a nice little dashboard of the units on the network and what kind of bandwidth each is using at that moment. I could figure this out from Wireshark or tcpdump but figure there has to be an easier way. I've tried a few different commercial products but none really presented the right info. Suggestions? (This may be a question for superuser since my Apple Time Capsule's SNMP capabilities are limited, but I figure admins of small business networks would have dealt w/ the same issue..)

    Read the article

  • Scanning to network share

    - by tking
    In our environment we have several printers with the scan-to-folder functionality set up. It worked pretty well. Long story short: I needed to reboot our file server one evening (the server hosting the shares where all the scans go); upon boot-up I received a message stating the name of the file server (i.e. AcmeFS1; Windows Server 2003) could not be used because there was a duplicate name on the network. I found that somehow another printer on the network was using the name of the file server. Weird - I know. People could no longer scan to their folders. I ran nbtstat on the file server and found that the local NetBIOS name table was empty. I renamed the offending printer and rebooted the file server once more. This time there was no error, and once I ran nbtstat again, I found the correct name and domain in the local NetBIOS name table. The problem is, scan to folder is still not working. I know this was working before I rebooted the file server the first time. Anyone have any idea what is going on and how to fix it? Thanks. FIXED: Not sure why the reboot didn't fix it, but I restarted the "Server" service on the file server and the problem went away.

    Read the article

  • Can't ping - home wireless network

    - by Naunidh
    Hello, this may seem like another ping problem, but I have tried a lot before posting it here. I have a Linksys WRT54G - firmware v8.00.8. I have two laptops, one Windows Vista (192.168.1.99) and one Windows XP (192.168.1.13), connected over WiFi. The router's IP address is 192.168.1.4, and the default gateway is the ADSL modem (192.168.1.1) connected through a wire. The problem is that the laptops cannot ping each other; they can ping the gateway and the Linksys router, and both can access the internet. The following has been tried (I am pinging from the XP machine to the Vista machine): I saw that ARP entries for the Vista machine were not being populated, so I added a static ARP entry: 192.168.1.99 00-19-7e-70-d0-4e static. I checked in Ethereal that an ICMP packet for the MAC address of the Vista machine does go out from the XP machine towards the Vista machine, but never reaches the Vista machine. So it gets eaten by the router? I added the Vista machine to the DMZ in my Linksys router, so that all the ports are open (in case it was an issue). Firewalls, antivirus etc. were turned off, echo was enabled explicitly on Vista, and file sharing and network discovery were turned on. Network type was set to private. I unchecked everything in the router's firewall, even though those settings are only meant for WAN requests. Is there anything else that I should try? Thanks.

    Read the article

  • IIS_IUSRS cannot access files uploaded and created by Network Service - error 401.3

    - by Max
    Let me rephrase my question as I have investigated further: The problem: I have a PHP script that is used to upload images on my Windows Server 2008 web server. The files are created in the correct directory. They are created and owned by the user Network Service, and Network Service has full access to the uploaded file. As soon as I try to access the uploaded file (mostly an image) via HTTP, I get a 401.3 not authorized error. Now, if I right-click on the inaccessible image and grant the IIS_IUSRS group read permissions via the Security tab, the image can be accessed! By default IIS_IUSRS has NO access at all to the uploaded file. The directory containing the image files has the correct access rights set, but each file that is newly uploaded to the directory is not permitted for IIS_IUSRS. The question: How can I grant IIS_IUSRS access to newly uploaded files by default? The appPool of the website has its identity set to its default; I also tried setting it to "networkIdentity" or so, but that did not work either.
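
    For reference, one workaround people use when new files do not pick up the wanted ACE is to grant it explicitly right after the upload is saved; a minimal sketch using Windows' icacls tool (the path is a placeholder, and a PHP application could issue the same command through exec()):

        # Sketch: explicitly give the IIS_IUSRS group read access to a freshly uploaded file.
        # The path below is only a placeholder for the real upload location.
        import subprocess

        def grant_iusrs_read(path):
            # "icacls <file> /grant IIS_IUSRS:R" adds a read ACE for the IIS_IUSRS group
            subprocess.check_call(["icacls", path, "/grant", "IIS_IUSRS:R"])

        if __name__ == "__main__":
            grant_iusrs_read(r"C:\inetpub\wwwroot\uploads\example.jpg")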

    Read the article

  • Why can I not edit or delete directories inside this directory?

    - by user43053
    Hello there, First, I thought this was PHP-related, but maybe it isn't. My original post, which may be irrelevant now, is located at the bottom. The problem is I have a directory: /articles/. In it are 10 sub-directories. I have been changing the permissions lately, but now it seems all the permissions of the parent folder, sub-folders and files are either chmod 755 or 777. I cannot move, delete or edit files inside of this parent directory or its sub-directories with my FTP client. I can, however, edit, delete, create new files and directories and change them with PHP functions without problems. What may the problem be? OLD POST. Ignore everything below this line: If I create a directory with mkdir(), or create a file with fopen(), file_put_contents() or SimpleXMLElement::asXML(), I am unable to access the file with my FTP client or cPanel File Manager. If I try to delete or edit them, I get errors. Dreamweaver suggests it is a permission problem or a network or filesystem fault (but I've set the permissions with chmod() to 0777, and when I check in cPanel, it confirms chmod 777). I also tried to use fileowner() and the function returns int(99), the same owner as those files that I could access with my FTP client. It seems files and directories created with PHP can only be modified or deleted with PHP. I thought this must be a server-setup-related issue, so I write it here. I am on a shared server, and I have no idea about setting up servers. EDIT: It seems the problem is different. I cannot move files with my FTP client to the parent or sub-directories either. This problem may not be PHP-related, then. It seems the problem applies to any directory, regardless of whether it was created by PHP. EDIT 2: The parent directory has chmod 755. Thank you for your time. Kind regards, Marius

    Read the article

  • Microsoft Word 2008 on the Mac sometimes "Disappears" documents, really.

    - by Ross Charette
    This happens in a computer lab environment and has happened at least 3 times. We are running Microsoft Office 2008 for Mac on Leopard, and everything is updated. Our users' home directories are on a network drive, but the /Library/Cache folder is running locally. Typically a student will have a Word file that they have been working on; it's been saved before they even logged onto the computer that day. They log on, open the document, click the save icon (not go to File > Save), sometimes even save multiple times, then close Word. The document is now gone. It's not hidden, and there are no autosaves or anything in the Cache folder. It is definitely not in the trash or Trashes folder. Word can't find it when you click on it in 'recent documents'. Searching meticulously through every folder in their home drive turns up nothing. They look using Finder; I look ssh'd in as root into their home using ls -la. I look for similar files in case they renamed it by mistake. It's gone. Disappeared. Vaporized. It's happened to at least 3 different users in the past year. Much whining. Any idea?

    Read the article

  • Win7 loses connection to network shares after resume unless server specified using FQDN

    - by Szonja Zemkó
    My Win7 client has a connection to a Linux server and its shared folders. The problem occurs when the computer wakes up after a sleep: one of the shared folders is no longer accessible and I receive the following message: Error code: 80070035, The network path was not found. I have the problem with one specific folder only. When I restart the computer, this problematic folder is accessible again. When I log off before sleep, the folder is accessible after wakeup. If I try to access the folder by using the FQDN of the server or the server IP, it is also accessible. As a temporary solution I mapped the folder to a network drive using the FQDN and it's working fine, but it's inconvenient since every other folder is accessible on the server. To summarize:
    - \\server\problematicshare no longer works after resume (the Samba server sees my client connect, then disconnect a few seconds later, while I receive the above error message)
    - \\server\othershare works after resume
    - \\fqdn.of.server\problematicshare always works
    - \\ip.of.server\problematicshare always works
    - once the problem manifests, I'm no longer able to restart the "Workstation" service (it is not responding)
    - restarting the "Computer Browser" service has no apparent effect
    - the event log doesn't contain anything that seems relevant
    - "ping server" works

    Read the article

  • Cannot connect to a shared network drive

    - by dublintech
    I am using Windows 7 and I cannot connect to a shared network drive on another machine. I can ping the machine. I can connect to the machine with Remote Desktop. The machine is on the same subnet. My friend with the exact same laptop as me (and on the same network, same workgroup) can connect to the shared folder. The machine I am trying to connect to and my friend's machine can both see shared folders on my machine. I also cannot see shared folders on the friend's laptop. When I select diagnose, Windows tells me nothing useful. When I select "see details" on the error pop-up, I see: Error code: 0x80004005 (Google doesn't help much). I can nbtstat -a the machine that has the shared folder. When I try with my firewall turned off, the same thing happens. I have ensured my Windows 7 has all updates. I run Security Essentials to ensure my laptop is clean. I run CCleaner to clean up my registry. Same error. I have tried with my laptop on both wireless and Ethernet. As you can imagine, I am banging my head against the wall on this one.

    Read the article

  • I have just created a subnet for a local network, connecting to a standalone server on another network; now I cannot connect to the internet

    - by Seth
    I am just learning some new aspects of servers and networking. We have a network of 5 subnets that all interconnect with each other. In order to get two computers onto the subnet that we were setting up, I changed their IPs from the subnet the standalone server is on (where they used to be set up) to the local subnet we are remotely hooking up. Likewise, I also changed the gateway to coincide with the new subnet. The only problem is that since doing this, I am unable to establish a connection to the internet. I can ping the server and the corresponding gateway & DNS server, but cannot get connected to the internet. We do have a dumb (non-programmable) switch connected that receives both the internet and private network inputs and distributes them (or should do so) to about 5 other computers. Bottom line, I cannot currently connect to the internet, and am wondering what could be causing this. It is likely something very obvious, and pardon me for being more vague than I probably should be, but I could use some help resolving this! Thanks for any help!

    Read the article

  • What Simple Changes Made the Biggest Improvements to Your Delphi Programs

    - by lkessler
    I have a Delphi 2009 program that handles a lot of data and needs to be as fast as possible and not use too much memory. What small, simple changes have you made to your Delphi code that had the biggest impact on the performance of your program, by noticeably reducing execution time or memory use? Thanks everyone for all your answers. Many great tips. For completeness, I'll post a few important articles on Delphi optimization that I found: "Before you start optimizing Delphi code" at About.com, "Speed and Size: Top 10 Tricks" also at About.com, and "Code Optimization Fundamentals" and "Delphi Optimization Guidelines" at High Performance Delphi, relating to Delphi 7 but still very pertinent.

    Read the article

  • How do I load the Oracle schema into memory instead of the hard drive?

    - by Andrew
    I have a certain web application that makes upwards of ~100 updates to an Oracle database in succession. This can take anywhere from 3-5 minutes, which sometimes causes the webpage to time out. A re-design of the application is scheduled soon, but someone told me that there is a way to configure a "loader file" which loads the schema into memory and runs the transactions there instead of on the hard drive, supposedly improving speed by several orders of magnitude. I have tried to research this "loader file" but all I can find is information about SQL*Loader, the bulk data loader. Does anyone know what he's talking about? Is this really possible, and is it a feasible quick fix, or should I just wait until the application is re-designed?

    Read the article

  • How to re-use a thread in Java?

    - by David
    I am building a console Sudoku Solver where the main objective is raw speed. I now have a ManagerThread that starts WorkerThreads to compute the neighbors of each cell, so one WorkerThread is started for each cell right now. How can I re-use an existing thread that has completed its work? The thread pool pattern seems to be the solution, but I don't understand what to do to prevent the thread from dying once its job has been completed. PS: I do not expect to gain much performance for this particular task; I just want to experiment with how multi-threading works before applying it to the more complex parts of the code. Thanks
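
    For context, the thread-pool pattern keeps workers alive by having them block on a shared work queue instead of exiting after one task; in Java this is what java.util.concurrent.ExecutorService provides. A minimal sketch of the same idea, shown in Python purely to illustrate the pattern (the worker function is a placeholder):

        # Sketch of the thread-pool idea: a fixed set of worker threads pulls jobs from a
        # queue and is reused for every job instead of dying after a single task.
        # Java's java.util.concurrent.ExecutorService offers the same behaviour.
        from concurrent.futures import ThreadPoolExecutor

        def neighbors_of(cell):
            # placeholder for the real "compute the neighbors of this cell" work
            return cell

        if __name__ == "__main__":
            cells = range(81)  # one job per Sudoku cell
            with ThreadPoolExecutor(max_workers=4) as pool:  # 4 threads handle all 81 jobs
                results = list(pool.map(neighbors_of, cells))
            print(len(results))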

    Read the article

  • Are Symfony and CakePHP too slow to be usable?

    - by Aziz Light
    Until now, I have always said that CakePHP is too bloated and slow. I don't really know that; I just saw "some" benchmarks. What I really want to know is whether those two frameworks (Symfony and CakePHP) are so slow that the user will get frustrated. I already know that those frameworks are slower than other alternatives, but that's not the question. I ask the question because I want to create a project management web application and I still hesitate between a couple of frameworks. I've had some trouble learning Zend, but IMHO I haven't tried hard enough. So in conclusion, in addition to the first question above, I would like to ask another question: If I want to create a project management tool (which is a pretty big project), which of the following would you suggest, considering the development time, the speed of the resulting application, and the robustness of the final product: Symfony, CakePHP, or Zend Framework? Also, I should mention that I don't know any of those frameworks, and that I want to learn one of them (at least).

    Read the article

  • How can a large number of developers write software together without either a cumbersome process or

    - by Mark Robinson
    I work at a company with hundreds of people writing software for essentially the same product. The quality of the software has to be high because so many people depend on it (not least the developers themselves). Because of this, every major issue has resulted in a new check - either automated or manual. As a result, the process of delivering software is becoming ever more burdensome. So that requires more developers, which... well, you can see it is a vicious circle. We now have a problem with releasing software quickly - the lead time even to change one line of code for a very serious issue is at least one day. What techniques do you use to speed up the delivery of software in a large organization, while still maintaining software quality?

    Read the article

  • Document Similarity: Comparing two documents efficiently

    - by seanieb
    I have a loop that calculates the similarity between two documents. It collects all the tokens in a document and their scores, and places them in a dictionary. It then compares the dictionaries. This is what I have so far; it works, but is super slow:

        # Doc A
        cursor1.execute("SELECT token, tfidf_norm FROM index WHERE doc_id = %s", (docid[i][0]))
        doca = cursor1.fetchall()
        # convert tuple to a dictionary
        doca_dic = dict((row[0], row[1]) for row in doca)
        # Doc B
        cursor2.execute("SELECT token, tfidf_norm FROM index WHERE doc_id = %s", (docid[j][0]))
        docb = cursor2.fetchall()
        # convert tuple to a dictionary
        docb_dic = dict((row[0], row[1]) for row in docb)
        # loop through each token in doca and see if one matches in docb
        for x in doca_dic:
            if docb_dic.has_key(x):
                # calculate the similarity by summing the products of the tf-idf_norm
                similarity += doca_dic[x] * docb_dic[x]
        print "similarity"
        print similarity

    I'm pretty new to Python, hence this mess. I need to speed it up; any help would be appreciated. Thanks.
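
    Purely as a sketch of one way that inner loop is often tightened (the data below is placeholder data standing in for the dictionaries built above), intersecting the two key sets and summing over the overlap avoids the per-token has_key lookups:

        # Sketch: sum the tf-idf products only over the tokens the two documents share.
        # doca_dic / docb_dic are placeholder data standing in for the real dictionaries.
        doca_dic = {"network": 0.4, "speed": 0.7, "card": 0.1}
        docb_dic = {"network": 0.2, "speed": 0.5, "link": 0.3}

        shared = set(doca_dic) & set(docb_dic)  # tokens present in both documents
        similarity = sum(doca_dic[t] * docb_dic[t] for t in shared)
        print(similarity)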

    Read the article

  • Why is my computer not showing a speedup when I use parallel code?

    - by Jared P
    So I realize this question sounds stupid (and yes, I am using a dual core), but I have tried two different libraries (Grand Central Dispatch and OpenMP), and when using clock() to time the code with and without the lines that make it parallel, the speed is the same. (For the record, they were both using their own form of parallel for.) They report being run on different threads, but perhaps they are running on the same core? Is there any way to check? (Both libraries are for C; I'm uncomfortable at lower layers.) This is super weird. Any ideas?
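
    One detail worth checking, offered only as a hedge: on POSIX systems C's clock() reports CPU time summed over all of a process's threads, so a run that parallelizes perfectly can show roughly the same clock() total as the serial run even though the wall-clock time halved. A small sketch of the wall-clock versus CPU-clock distinction, written in Python for illustration:

        # Sketch: time a work region with both a wall clock and a CPU clock.
        # For parallel code the wall-clock number is the one that should shrink; the CPU
        # total (what clock() usually reports on POSIX) stays roughly constant.
        import time

        def busy_work(n=2000000):
            return sum(i * i for i in range(n))

        if __name__ == "__main__":
            wall0, cpu0 = time.perf_counter(), time.process_time()
            busy_work()
            wall1, cpu1 = time.perf_counter(), time.process_time()
            print("wall: %.3f s, cpu: %.3f s" % (wall1 - wall0, cpu1 - cpu0))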

    Read the article

  • How to decomment an html/php webpage?

    - by Sam
    A crazy question: Imagine a webpage file called somepage.php that contains some HTML and PHP. In my editor I see:

        <html><head></head><body>
        <?=$welcome . $essay . $thatsAllForNowFolks . $footer ?>
        <!-- Blue Ball Bell Blow Bows Bats Beef Bark Bill Boss -->
        </body></html>

    When I browse my site I see those comments in the final result, while I only want that comment to be in my editor, for my secretive inspirations; I don't want the whole world to know what I'm thinking while I'm developing, and sending those comments to every visitor is also wasted bandwidth. How do I decomment my entire HTML/PHP files at the moment the HTML is served? Ideas, code and suggestions are much appreciated. My thanks in advance...
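
    For reference, the usual approach is to strip the <!-- ... --> blocks from the generated markup just before it leaves the server (in PHP this is typically hung off an output-buffering callback); the stripping itself is a one-line regular expression, sketched here in Python:

        # Sketch: remove HTML comments from markup before it is served.
        # Note that conditional comments for old IE (e.g. <!--[if IE]>) would be stripped too.
        import re

        COMMENT_RE = re.compile(r"<!--.*?-->", re.DOTALL)

        def strip_html_comments(html):
            return COMMENT_RE.sub("", html)

        if __name__ == "__main__":
            page = "<html><body><!-- Blue Ball Bell --><p>visible</p></body></html>"
            print(strip_html_comments(page))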

    Read the article

  • How does git save space and stay fast at the same time?

    - by eSKay
    I just saw the first git tutorial at http://blip.tv/play/Aeu2CAI. How does git store all the versions of all the files and still be more economical in space than Subversion, which saves only the latest version of the code? I know this can be done using compression, but that would be at the cost of speed, yet this also says that git is much faster (though where it gains the most is the fact that most of its operations are offline). So, my guess is that git compresses data extensively and is still faster because uncompression + work is still faster than network_fetch + work. Am I correct? Even close?

    Read the article

  • PHP fastest method of reading server response

    - by Peter John
    Hi there, I'm having some real problems with the lag produced by using fgets to grab the server's response to some batch database calls I'm making. I'm sending through a batch of, say, 10,000 calls, and I've tracked the lag down to fgets causing the hold-up in the speed of my application, as the response for each call needs to be grabbed. I have found this thread http://bugs.php.net/bug.php?id=32806 which explains the problem quite well, but he's reading a file, not a server response, so fread could be a bit tricky as I could get part of the next line, and extra stuff which I don't want. Any help much appreciated!
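
    For reference, the usual cure for per-line read overhead is to pull large chunks off the stream and split the lines in memory, keeping any partial line in a buffer for the next read; a minimal sketch of that buffering pattern, written in Python with a socket standing in for whatever stream fgets is reading:

        # Sketch: read a stream in big chunks and yield complete lines from a buffer,
        # instead of issuing one small read per line.
        def iter_lines(sock, chunk_size=8192):
            buf = b""
            while True:
                chunk = sock.recv(chunk_size)
                if not chunk:        # connection closed
                    if buf:
                        yield buf    # trailing partial line
                    return
                buf += chunk
                while b"\n" in buf:
                    line, buf = buf.split(b"\n", 1)
                    yield line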

    Read the article

  • Does any faster centralized version control than SVN exist?

    - by Savageman
    Hello, I've been using SVN for a long time and now we're trying out Git. I'm not talking about the centralized / decentralized debate here. My only concern is speed. The latter tool is much faster. But sometimes I NEED to work with a centralized approach, which is much simpler and less complex than the decentralized one. The learning curve is really fast, which saves a lot of time (while digging into decentralized would lead to a waste of time, given the learning curve is much longer and we encounter more problems when working with it). However, SVN is really slow compared to Git, and I don't think it has anything to do with the centralized argument. Decentralized systems also have to deal with server connections and file transfer. So I can easily imagine that a faster implementation of centralized version control could exist. Does anyone have any clue on this?

    Read the article

  • Can I get a faster output pipe than /dev/null?

    - by naugtur
    Hi, I am running a huge task [automated translation scripted with Perl + a database etc.] that will run for about 2 weeks non-stop. While thinking about how to speed it up, I saw that the translator outputs everything (all translated sentences, all info along the way) to STDOUT all the time. This makes it work visibly slower when I get the output on the console. I obviously piped the output to /dev/null, but then I thought "could there be something even faster?" It's so much output that it'd really make a difference. And that's the question I'm asking you, because as far as I know there is nothing faster... (But I'm far from being a guru, having used Linux on a daily basis for only the last 3 years.)
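
    For what it's worth, /dev/null is already about as cheap a sink as the system offers; the bigger win is usually to stop producing the output inside the program, so the strings are never formatted or written at all. A sketch of that idea, in Python for illustration (a Perl script would do the same thing by gating its prints behind a verbosity flag):

        # Sketch: the cheapest "pipe" is output that is never produced - gate the chatty
        # progress printing behind a verbosity flag so the strings are not even formatted
        # when the flag is off.
        import sys

        VERBOSE = "--verbose" in sys.argv

        def log(fmt, *args):
            if VERBOSE:
                print(fmt % args)

        if __name__ == "__main__":
            for i in range(5):
                log("translated sentence %d of the batch", i)  # near-zero cost when quiet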

    Read the article
