Search Results

Search found 21644 results on 866 pages for 'connection speed'.

Page 435/866

  • Shortcut with arguments in Debian

    - by Duncan
    I have a volume on a Debian server which contains a large number of full-resolution images in various folders. What I'd like to do is have a separate browse proxy folder containing lower-quality browse copies, so that users dialling in over low-speed connections can still view them. Ideally these would be created on the fly with ImageMagick, so there's no need to store the large number of browse copies permanently or to worry about keeping them up to date. The way I'd envisaged this happening is for the browse proxy folder to contain a duplicate file and folder structure, with symlinks pointing to a script that transforms the image, taking the file path as an argument. But I know this isn't possible with symlinks, so I'm wondering if there's another way of doing this on Linux. On Windows, shortcuts can take arguments; how can I do the same on a Linux platform? (Or perhaps I'm going about this the wrong way?)
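
    One way this could work without symlink arguments is to put a small on-demand handler in front of the browse tree and let it shell out to ImageMagick, caching the result. The sketch below is a minimal illustration of that idea; the directory paths, resize geometry and quality setting are assumptions, not anything from the original setup.

        #!/usr/bin/env python3
        """Minimal sketch: produce a low-quality browse copy on demand with ImageMagick.
        Assumed layout (placeholders): originals under /srv/images, generated browse
        copies cached under /srv/browse-cache, and ImageMagick's 'convert' on PATH."""
        import os
        import subprocess
        import sys

        SOURCE_ROOT = "/srv/images"        # full-resolution originals
        CACHE_ROOT = "/srv/browse-cache"   # generated browse copies
        GEOMETRY = "1024x1024>"            # shrink only if larger, keep aspect ratio

        def browse_copy(relative_path):
            """Return a browse copy's path, creating it with 'convert' if missing or stale."""
            source = os.path.join(SOURCE_ROOT, relative_path)
            target = os.path.join(CACHE_ROOT, relative_path)
            if (not os.path.exists(target)
                    or os.path.getmtime(target) < os.path.getmtime(source)):
                os.makedirs(os.path.dirname(target), exist_ok=True)
                subprocess.run(
                    ["convert", source, "-resize", GEOMETRY, "-quality", "60", target],
                    check=True,
                )
            return target

        if __name__ == "__main__":
            print(browse_copy(sys.argv[1]))

    Wired up behind a lightweight web front end (or a FUSE layer), this gives the appearance of a parallel browse folder without storing every copy permanently.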

    Read the article

  • Pushing image changes to multiple servers

    - by gms8994
    I need the ability to push images out to multiple servers whenever they're updated. I've looked at network filesystems, but they're all but worthless due to their speed. Images can be uploaded to any one of 3 servers and would then need to be copied to the other 2. Any suggestions? I'm open to trying just about anything. EDIT: Graphics data (jpg, gif, png, etc.), Linux only, and it's all on the local network. We're currently using rsync, but having it work back and forth is getting cumbersome.
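
    Since the files can land on any of the three servers, one low-tech option is for each box to push new uploads to its two peers with rsync in update-only mode, so fresh uploads are never overwritten by older copies coming the other way. A minimal sketch (host names and the image directory are placeholders; run it from cron or an inotify trigger on each server with its own peer list):

        #!/usr/bin/env python3
        """Minimal sketch: push local image changes to the two peer servers with
        rsync over SSH. Hosts and paths are placeholder assumptions."""
        import subprocess

        IMAGE_DIR = "/var/www/images/"   # trailing slash: sync directory contents
        PEERS = ["img2.example.com", "img3.example.com"]

        def push_to_peers():
            for peer in PEERS:
                # -a preserves attributes, -u skips files that are newer on the peer,
                # so the three-way cross-pushes don't clobber fresh uploads.
                subprocess.run(
                    ["rsync", "-au", IMAGE_DIR, f"{peer}:{IMAGE_DIR}"],
                    check=True,
                )

        if __name__ == "__main__":
            push_to_peers()

    Tools like lsyncd or csync2 package essentially the same idea if maintaining the scripts themselves gets cumbersome.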

    Read the article

  • Installing WiFi in a church for internet access

    - by Dave Griffiths
    Hi, a colleague at work asked this question: my local church wants to get WiFi enabled throughout their offices and the main church building. The offices and main building are part of the same overall structure, so the signal does not need to travel to any other buildings outside it. Where would I start on such a project, and what kind of repeaters are needed (and how are they set up), given that some of the walls are pretty thick? Do you still have one ADSL line coming in to the router, or are there options for having two lines coming in so that there isn't a single point of failure? I'm looking at fibre for the broadband for increased speed. Any starting pointers appreciated.

    Read the article

  • Can I connect two modems and communicate between the two sides?

    - by GongT
    I'm just wondering how modems work. I know the process of "modulation" and "demodulation"; what I want to know is whether "demodulation" is simply the reverse of "modulation" in the real world. The PC gets an IP address when connected as in type 1; what will happen when I connect them as in type 2?

        type 1 : [PC] ================= [router]
        type 2 : [PC] === [m] ------- [m] === [router]

        [m] : modem (exactly the same model at both ends)
        === : Ethernet cable
        --- : DSL cable (phone line, maybe optical fiber, or something else?)

    The ISP has a large number of modems. Are they the same thing as the one in my home (but with different size/speed/price, etc.), or something completely different?

    Read the article

  • My laptop keeps hard resetting

    - by cgoddard
    I have had this laptop for a long time (five years at least) and it hasn't had the best treatment over the years. But over the last few months it has been randomly and completely shutting down: no blue screen, no orderly shutdown, just a black screen and then the BIOS. I think it might be happening as I save, but I can't be sure; recently it has been getting very annoying. I did initially think it was an overheating issue, but the speed at which it comes back online is staggeringly fast for it to have cooled down sufficiently, plus it has got incredibly hot before with no issue (since it's been doing this). Does anyone know what might be going on? Dell Inspiron 640m, running Windows 7.

    Read the article

  • How does Storage Spaces decide where to put my files?

    - by George Duckett
    With Windows 8 Storage Spaces, you can lump many hard disks of varying types, speeds and sizes together to use as a single storage space / logical drive. How does Windows decide what to place where? For example, will it move files about depending on frequency of access, perhaps splitting files that are frequently accessed together across hard disks? What does it optimize for: speed, reliability, something else? If the above is asking too much, can I at least easily see where the files physically are (on which physical disk)?

    Read the article

  • Why does the heat production increase as the clockrate of a CPU increases?

    - by Nils
    This is probably a bit off-topic, but the whole multi-core debate got me thinking. It's much easier to produce two cores (in one package) than to speed up one core by a factor of two. Why exactly is this? I googled a bit, but found mostly very imprecise answers on overclocking boards which do not explain the underlying physics. The voltage seems to have the most impact (quadratic), but do I need to run a CPU at a higher voltage if I want a faster clock rate? I'd also like to know why exactly (and how much) heat a semiconductor circuit produces when it runs at a certain clock speed.
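
    The usual first-order answer is the dynamic switching power of CMOS, P ≈ C·V²·f: switched capacitance times voltage squared times clock frequency. Because a higher clock generally also needs a higher supply voltage so the transistors switch fast enough, power grows much faster than linearly with frequency, which is why two slower cores are thermally cheaper than one core at twice the clock. A back-of-the-envelope sketch (the capacitance and voltage figures are illustrative assumptions, not datasheet values):

        # Dynamic CMOS switching power: P ~ C * V^2 * f.
        # The numbers below are illustrative assumptions, not measured values.

        def dynamic_power(capacitance_f, voltage_v, frequency_hz):
            """First-order switching power of a CMOS circuit, in watts."""
            return capacitance_f * voltage_v ** 2 * frequency_hz

        base = dynamic_power(1e-9, 1.1, 2.0e9)   # one core at 2 GHz, 1.1 V
        dual = 2 * base                          # two such cores: ~2x the power
        fast = dynamic_power(1e-9, 1.3, 4.0e9)   # one core at 4 GHz, needing ~1.3 V

        print(f"one core at 2 GHz : {base:.2f} W")
        print(f"two cores at 2 GHz: {dual:.2f} W")
        print(f"one core at 4 GHz : {fast:.2f} W ({fast / base:.1f}x the 2 GHz core)")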

    Read the article

  • Properly handling NSURLConnection errors

    - by Cal S
    Hi, I have a simple form interface set up that sends username and password information to a server (this part works):

        NSString *postData = [NSString stringWithFormat:@"user=%@&pass=%@",
            [self urlEncodeValue:sysUsername], [self urlEncodeValue:password]];
        NSLog(@"Post data -> %@", postData);

        NSData *postVariables = [postData dataUsingEncoding:NSASCIIStringEncoding allowLossyConversion:YES];
        NSMutableURLRequest *request = [[[NSMutableURLRequest alloc] init] autorelease];
        NSString *postLength = [NSString stringWithFormat:@"%d", [postVariables length]];
        NSURL *postUrl = [NSURL URLWithString:@"http://localhost/~csmith/cocoa/test.php"];
        [request setURL:postUrl];
        [request setHTTPMethod:@"POST"];
        [request setValue:postLength forHTTPHeaderField:@"Content-Length"];
        [request setValue:@"application/x-www-form-urlencoded" forHTTPHeaderField:@"Content-Type"];
        [request setHTTPBody:postVariables];

        NSData *returnData = [NSURLConnection sendSynchronousRequest:request
                                                    returningResponse:NULL
                                                                error:NULL];
        NSLog(@"Post data SENT & returned -> %@", returnData);

    How do I handle connection errors such as no internet connection, a firewall, etc.? Also, does this method use the system-wide proxy settings? Many of my users are behind a proxy. Thanks a lot!

    Read the article

  • Download resume support blocked by ISP?

    - by John Doe
    Can an ISP block resume support for downloads? I'm using IDM (Internet Download Manager) to download from websites that support resuming, yet I am unable to resume downloads. I tried different computers with the same result, and turning off the firewall didn't have any effect. I was able to download with no issues until a couple of days ago. Another thing I noticed is that IDM used to open several connections to speed up my downloads, but now it can only open one connection. Also, when I download over my VPN, I am able to download and resume with no problem.
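
    Resume (and IDM's multi-connection downloads) depend on HTTP Range requests being honoured end to end, so one way to narrow this down is to send a ranged request yourself and see whether a 206 Partial Content comes back, both directly and over the VPN. A minimal sketch (the URL is a placeholder for the file you are trying to resume):

        #!/usr/bin/env python3
        """Minimal sketch: check whether a URL honours HTTP Range requests."""
        import urllib.request

        URL = "http://example.com/path/to/large-file.zip"   # placeholder

        req = urllib.request.Request(URL, headers={"Range": "bytes=0-1023"})
        with urllib.request.urlopen(req) as resp:
            status = resp.status
            accept_ranges = resp.headers.get("Accept-Ranges", "<not sent>")
            length = resp.headers.get("Content-Length", "<unknown>")

        # 206 Partial Content: byte ranges survive the whole path.
        # 200 with the full file length: something between you and the server
        # (server config, proxy, or ISP middlebox) is ignoring the Range header.
        print(f"status={status} Accept-Ranges={accept_ranges} Content-Length={length}")

    If the direct check returns 200 while the same check over the VPN returns 206, the Range header is being stripped or rewritten somewhere on the ISP's side of the path.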

    Read the article

  • Utilising a Magento server cluster to drive hot reindexing

    - by WOBenji
    We've asked a similar question in the past. Basically, we have a very large Magento store with 500,000 products which are currently reindexed once a day, during the night. We'd like to speed this process up significantly; we're at about 4-5 hours now. It was suggested that we do the reindexing on a server cluster, on a machine that isn't busy serving customers, and then replicate the database changes back once they're done. But what is the mechanism for that? How do we replicate those changes across to the live site from the server cluster? Can someone point me in the right direction here?

    Read the article

  • Adding expire headers to content served from CDN?

    - by mdolon
    I'm using MaxCDN to serve content to my blog via W3 Total Cache. The problem I'm running into when evaluating my site with Google Page Speed and YSlow is that expire headers are not being sent on content delivered from the CDN, nor is that content served from a cookieless domain. Is this something that is completely in the hands of my CDN, or is it something I can fix in my server configuration? Some info about my setup:

        - nginx with php-fastcgi
        - WordPress 3.0
        - W3 Total Cache 0.9a (dev release)
        - MaxCDN
        - the site: http://devgrow.com/

    Read the article

  • Python client / server question

    - by AustinM
    I'm working on a bit of a project in Python. I have a client and a server. The server listens for connections, and once a connection is received it waits for input from the client. The idea is that the client can connect to the server and execute system commands such as ls and cat. This is my server code:

        import sys, os, socket

        host = ''
        port = 50105

        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.bind((host, port))
        print("Server started on port: ", port)
        s.listen(5)
        print("Server listening\n")

        conn, addr = s.accept()
        print 'New connection from ', addr
        while (1):
            rc = conn.recv(5)
            pipe = os.popen(rc)
            rl = pipe.readlines()
            file = conn.makefile('w', 0)
            file.writelines(rl[:-1])
            file.close()
            conn.close()

    And this is my client code:

        import sys, socket

        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        host = 'localhost'
        port = input('Port: ')
        s.connect((host, port))
        cmd = raw_input('$ ')
        s.send(cmd)
        file = s.makefile('r', 0)
        sys.stdout.writelines(file.readlines())

    When I start the server I get the right output, saying the server is listening. But when I connect with my client and type a command, the server exits with this error:

        Traceback (most recent call last):
          File "server.py", line 21, in <module>
            rc = conn.recv(2)
          File "/usr/lib/python2.6/socket.py", line 165, in _dummy
            raise error(EBADF, 'Bad file descriptor')
        socket.error: [Errno 9] Bad file descriptor

    On the client side, I get the output of ls, but the server gets screwed up.
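
    The traceback is consistent with conn.close() sitting inside the while loop: the first command is served, the socket is closed, and the second recv() then fails with EBADF. A minimal sketch of the server loop with the close moved outside (kept in the Python 2 style of the question):

        # Minimal sketch: same single-client design as the question, but the
        # connection is only closed after the client disconnects (recv returns '').
        import os
        import socket

        host, port = '', 50105
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.bind((host, port))
        s.listen(5)

        conn, addr = s.accept()
        while True:
            rc = conn.recv(1024)        # a bigger buffer than 5 bytes per command
            if not rc:                  # empty string: the client closed the connection
                break
            pipe = os.popen(rc)
            conn.sendall(''.join(pipe.readlines()))
            pipe.close()
        conn.close()
        s.close()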

    Read the article

  • Slow network file copy on Windows 7

    - by Jason V.
    I wrote a program that uses xcopy to transfer files (usually between 1 KB and 2 MB) over our intranet. Usually I am copying files from my host machine (Windows 7 x64) to a VMware virtual machine running Windows Server 2008 (the VM is running on my host machine, if that matters). On Windows XP, the file transfers usually only require a few seconds to complete, but on my Windows 7 machine the transfer of the first file (1.5 MB) takes around 1.5 minutes. This is true whether I use xcopy, robocopy, or File.Copy() programmatically. I noticed that if I use File.Copy, the first transfer is very slow and subsequent transfers are much faster. Any clue how I can speed up the process? Is there a setting in Windows 7 (or Server 2008) that I could try?

    Read the article

  • Make a PHP GET request from a PHP script and exit

    - by Abs
    Hello all. Is there something simpler than the following? I am trying to make a GET request to a PHP script and then exit the current script. I think this is a job for cURL, but is there something simpler, as I don't really want to worry about enabling the cURL PHP extension? In addition, will the code below start the PHP script and then just come back, without waiting for it to finish?

        //set GET variables
        $url = 'http://domain.com/get-post.php';
        $fields = array(
            'lname' => urlencode($last_name),
            'fname' => urlencode($first_name)
        );

        //url-ify the data for the GET
        foreach ($fields as $key => $value) {
            $fields_string .= $key . '=' . $value . '&';
        }
        rtrim($fields_string, '&');

        //open connection
        $ch = curl_init();

        //set the url, number of POST vars, POST data
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_GET, count($fields));
        curl_setopt($ch, CURLOPT_GETFIELDS, $fields_string);

        //execute GET
        $result = curl_exec($ch);

        //close connection
        curl_close($ch);

    I want to run the other script (which contains functions) when a condition is met, so a simple include won't work, as the if condition would wrap around the functions, right? Please note, I am on a Windows machine and the code I am writing will only be used on Windows. Thanks all for any help and advice.

    Read the article

  • MySQL skip-name-resolve

    - by user72459
    I recently purchased a new server and transferred all my accounts via WHM Transfer. The problem is that when WHM takes its daily backup, it outputs a message such as:

        DBD::mysql::st execute failed: There is no such grant defined for user 'abc' on host 'myhostname'

    The problem is solved when I remove skip-name-resolve from my my.cnf file. Though I don't notice any difference in speed when I don't add it, it is often mentioned in forums that adding skip-name-resolve improves MySQL performance. Does adding skip-name-resolve really help on a dedicated server?

    Read the article

  • For a very early-stage startup: home server or EC2?

    - by StCee
    The micro instance of Amazon EC2 only has 613 MB of RAM, while my laptop has 8 GB, and I suppose the processing power of my computer would likewise be better than the micro instance. My question is: what are the considerations in deciding whether to host it yourself or on Amazon EC2, especially for a really young startup? For example, would network speed be a problem? My home broadband is 100 Mbps up to 1 Gbps; how would Amazon compare to this? My site at the moment would just host some images and serve some PHP requests. I would probably also use CloudFlare, but it seems to increase the DNS lookup time considerably... And of course the overall objective is to give the best user experience.

    Read the article

  • Can one thread open a socket and another thread close it?

    - by Pkp
    I have some kernel threads in the Linux kernel, inside my KLM. I have a server thread that listens on the channel; once it sees an incoming connection, it creates an accept socket, accepts the connection and spawns a child thread. It also passes the accepted socket to the child kernel thread as the (void *) argument. The code is working fine. I have a design question: suppose the threads now have to be terminated, both the main thread and the children. What would be the best way to close the accept sockets? I can see two ways. 1] The main thread waits for all the child threads to exit; each child thread closes its accept socket while exiting, and the last child thread signals the main thread so it can exit. Here, even though the main thread was the one that created the accept sockets, the child threads close them, and they do this before the main thread exits. Is this acceptable? Do you foresee any problems here? 2] The main thread closes all the accept sockets it created before it exits. But there is a possibility (a corner case) that the main thread gets an exception and has to close; if it closes the accept sockets before exiting, the child threads using those sockets will be in danger. Hence I am using the first approach. Let me know what you think.

    Read the article

  • How can I check that I'm not frying the chipset on my 945GCLF?

    - by Journeyman Geek
    I'm running an old 945GCLF as a home server. It's an odd little system with a passively cooled single-core Atom 230 processor and a small heatsink/fan combo over the chipset. lm-sensors detects one temperature sensor, which I guess is the processor:

        coretemp-isa-0000
        Adapter: ISA adapter
        Core 0:  +38.0°C  (crit = +100.0°C)

    The fan is making rather horrible noises and I am considering unplugging it or fitting a speed reducer; there's also a risk it may fail anyway. Is there any way I can monitor the temperature of the chipset or be warned of any issues with it? Is there anything specific I can watch out for that would indicate trouble?
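
    Whether the 945G northbridge exposes a readable sensor depends on the board and on which hwmon drivers load (often a Super I/O chip in the it87 or w83627 family, which sensors-detect can find), but anything that does load shows up under /sys/class/hwmon. A small sketch that dumps whatever is available, as a way to see if there is more than the coretemp reading:

        #!/usr/bin/env python3
        """Minimal sketch: print every temperature exposed under /sys/class/hwmon."""
        import glob
        import os

        for temp_input in sorted(glob.glob("/sys/class/hwmon/hwmon*/temp*_input")):
            hwmon_dir = os.path.dirname(temp_input)
            try:
                with open(os.path.join(hwmon_dir, "name")) as f:
                    chip = f.read().strip()
                with open(temp_input) as f:
                    millidegrees = int(f.read().strip())
            except (OSError, ValueError):
                continue
            label_path = temp_input.replace("_input", "_label")
            label = ""
            if os.path.exists(label_path):
                with open(label_path) as f:
                    label = f.read().strip()
            print("%-12s %-12s %6.1f °C  %s" % (
                chip, os.path.basename(temp_input), millidegrees / 1000.0, label))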

    Read the article

  • Create SQL parameters programmatically

    - by Neo
    Another annoying one for me, but probably something simple. I have a number of possible WHERE clauses for a query based on user input; my question is how I can add these programmatically. For instance:

        wherequery = @"WHERE fieldname = @p_FieldName AND ";

        if (txtValue.textLength > 0) {
            wherequery += "fieldname2 = @p_FieldName2 AND ";
        }

        query = @"SELECT * FROM tabe" + wherequery;
        sql = connection.CreateCommand();
        sql.CommandText = query;

    How would I go about doing the parameters for that? I've tried ArrayLists, Dictionaries and a few other methods but can't find a way of doing it. Ideally I'd want to do something like this:

        SqlParameter[] sqlparams;

        wherequery = @"WHERE fieldname = @p_FieldName AND ";

        if (txtValue.textLength > 0) {
            wherequery += "fieldname2 = @p_FieldName2 AND ";
            sqlparams.Parameters.Add("@p_FieldName2 ", SqlDbType.VarChar).Value = txtValue.text;
        }

        query = @"SELECT * FROM tabe" + wherequery;
        sql = connection.CreateCommand();
        sql.CommandText = query;
        sql.Parameters.Add(sqlparams);

    Read the article

  • Windows Server: Do I really need servers in remote locations?

    - by IMAbev
    I have one main site with several servers in a 2008/2012 environment. I have 4 remote sites that are physically close (a few miles apart) and are all connected to the main site by 20 Mb fiber on a private network. At each of the remote locations I have a Windows server that users log in to and where their files and apps are located. There are many considerations in answering this question, but the first thing I am wondering is: do I really need a server at each location? Users are just logging in to these servers for permissions, and the vast majority of my users are only using Word, Excel and email. I am really interested in figuring out whether I need servers at these locations: $3,000 to $4,000 per server every 3-5 years, licensing, administration... I know there are other considerations - speed, redundancy, and the fact that if my link to the main site goes down the users have nothing. But I am just not convinced I need servers at these locations.

    Read the article

  • Prioritize file sharing capabilities in Windows Server 2008

    - by cmbrnt
    I've got a server running Windows Server 2008 and use it mainly for sharing files throughout the domain from a number of disks. It's running on VMware ESXi 4.0, in case that matters. My problem is that when I log in to the server to check user permissions etc., access to the files on the remote disks almost grinds to a halt. I haven't been able to measure the speeds, but I would guess it slows down to about 100 kB/s as soon as I log in. This is on a gigabit network and the problem is the same for all users, even the ones connected to the same switch as the server. I've assigned 2 GB of RAM to the server and reserved 1.5 GHz of processor power for it. I don't have to do anything special on the server for this halt to occur. How can I make sure file sharing is prioritized on the server, so that no matter what applications I'm running it will always keep file sharing working properly? Could this be a VMware issue?

    Read the article

  • How to set the ActiveMQ redeliveryPolicy on a queue?

    - by edbras
    How do I set the redeliveryPolicy on a queue in ActiveMQ? 1) In the docs (see ActiveMQ Redelivery), they explain that you should set it on the ConnectionFactory or Connection, but I want to use different values for different queues. 2) Apart from that, I can't seem to get it to work at all. Setting it on the connection factory in Spring (I am using ActiveMQ 5.4.2 with Spring 3.0) like this doesn't seem to have any effect:

        <amq:connectionFactory id="amqConnectionFactory" brokerURL="${jms.factory.url}">
            <amq:properties>
                <amq:redeliveryPolicy maximumRedeliveries="6"
                                      initialRedeliveryDelay="15000"
                                      useExponentialBackOff="true"
                                      backOffMultiplier="5"/>
            </amq:properties>
        </amq:connectionFactory>

    I also tried to set it as a property on the defined queue, but that also seems to be ignored, as the redelivery occurs sooner than the defined values:

        <amq:queue id="jmsQueueDeclarationSnd" physicalName="${jms.queue.declaration.snd}">
            <amq:properties>
                <amq:redeliveryPolicy maximumRedeliveries="6"
                                      initialRedeliveryDelay="15000"
                                      useExponentialBackOff="true"
                                      backOffMultiplier="5"/>
            </amq:properties>
        </amq:queue>

    Thanks

    Read the article

  • Some explanation of WLAN adapter antennas and radios

    - by gert_78
    We have a desktop in a room with no wired connection, and it gets bad reception from the wireless AP. Someone (I don't remember who, of course) told me there are high-gain antennas that make it possible to get better reception (and throughput); I believe he said they are adapters with "high gain" antennas. He also told me he uses one in hotels when he has bad WLAN reception, and when he connects that card the reception and speed are suddenly a lot better. Can someone explain in understandable language what that is, and what the whole high gain, dB, dBi and mW business is about, please? What type of card do we need? One like this or this?
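
    For the units themselves: dB is a logarithmic ratio, dBm is power relative to 1 mW (dBm = 10·log10(P/1 mW)), and dBi is antenna gain relative to an ideal isotropic radiator, so a "high gain" antenna focuses the same transmit power into a narrower pattern rather than adding any. A small conversion sketch (the example figures are purely illustrative):

        import math

        def mw_to_dbm(milliwatts):
            """Power in dBm: 10 * log10(P / 1 mW)."""
            return 10 * math.log10(milliwatts)

        def dbm_to_mw(dbm):
            return 10 ** (dbm / 10)

        # Illustrative example: a 100 mW (20 dBm) radio with a 9 dBi antenna has an
        # effective radiated power of about 29 dBm (~800 mW) in the antenna's
        # favoured direction; every extra 3 dB roughly doubles the power.
        tx_dbm = mw_to_dbm(100)      # 20.0 dBm
        eirp_dbm = tx_dbm + 9        # add antenna gain in dBi
        print("TX: %.1f dBm, EIRP: %.1f dBm = %.0f mW"
              % (tx_dbm, eirp_dbm, dbm_to_mw(eirp_dbm)))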

    Read the article

  • Best Approach for Checking and Inserting Records

    - by nevets1219
    One of our existing C programs does the following:

        Open connection to DB
        for record in all_record:
            if record contains certain data:
                if record is NOT in table A:                      // see #1
                    insert record information into table A and B  // see #2
        Close connection to DB

        #1: a SELECT field FROM table WHERE field = XXX
        #2: 2 INSERTs

    This is typically done every X months to sync everything up, or so I'm told. I've also been told that this process takes roughly a couple of days. There are (currently) at most 2.5 million records (though not necessarily all 2.5M will be inserted). One of the tables contains 10 fields and the other 5 fields. There isn't much to be done about iterating through the records, since that part can't be changed at the moment. What I would like to do is speed up the part where I query MySQL. I'm not sure if I have left out any important details -- please let me know! I'm also no SQL expert, so feel free to point out the obvious. I've thought about:

        - Putting all the inserts into a transaction (at the moment I'm not sure how important it is for the transaction to be all-or-none, or whether this affects performance)
        - Using INSERT ... WHERE NOT EXISTS ...
        - LOAD DATA INFILE (but that would require that I create a (possibly) large temp file)
        - Dropping the indexes so they aren't recalculated during the load (I read this somewhere; hopefully someone can confirm)

    The MySQL version is: mysql Ver 14.7 Distrib 4.1.22, for sun-solaris2.10 (sparc) using readline 4.3
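
    One pattern that removes the per-record SELECT round trip is to let MySQL do the existence check itself, for example with INSERT IGNORE against a UNIQUE key (or INSERT ... SELECT ... WHERE NOT EXISTS), batched inside a single transaction. Below is a minimal DB-API sketch of the idea; the table names, columns, credentials and sample rows are placeholders, and the original C program would do the equivalent through the MySQL C API:

        # Minimal sketch: batch the "not already in table A" check into the insert
        # itself. Assumes a UNIQUE key on the lookup column of each table so that
        # INSERT IGNORE silently skips rows that already exist.
        import MySQLdb  # any DB-API 2.0 driver exposes the same cursor interface

        rows_a = [("key1", "data1"), ("key2", "data2")]   # built while iterating records
        rows_b = [("key1", "more1"), ("key2", "more2")]

        conn = MySQLdb.connect(host="localhost", user="app", passwd="secret", db="appdb")
        cur = conn.cursor()
        cur.executemany(
            "INSERT IGNORE INTO table_a (key_field, data_field) VALUES (%s, %s)", rows_a)
        cur.executemany(
            "INSERT IGNORE INTO table_b (key_field, extra_field) VALUES (%s, %s)", rows_b)
        conn.commit()   # one transaction for the batch instead of millions of round trips
        cur.close()
        conn.close()

    In practice the batches would be flushed every few thousand rows rather than held in memory, and dropping secondary indexes before the bulk load and recreating them afterwards also helps at this size.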

    Read the article

  • VB syntax error in INSERT INTO statement

    - by user201806
    I'm new to VB. I created a program to connect to MS Access, but when I run it I get "Syntax error in INSERT INTO statement" (an unhandled OleDbException). Here is my code:

        Public Class Form2
            Dim cnn As New OleDb.OleDbConnection

            Private Sub Form2_Load(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles MyBase.Load
                txtdate.Value = DateTime.Now
                cnn = New OleDb.OleDbConnection
                cnn.ConnectionString = "Provider=Microsoft.Jet.Oledb.4.0; Data Source=C:\Users\John\Documents\db.mdb"
            End Sub

            Private Sub btnsave_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles btnsave.Click
                If Not cnn.State = ConnectionState.Open Then
                    cnn.Open()
                End If
                Dim cmd As New OleDb.OleDbCommand
                cmd.Connection = cnn
                cmd.CommandText = "INSERT INTO sr(names,add,tel,dates,prob,serv,model,snm,acc,sna,remark)" & _
                    "VALUES ('" & Me.txtname.Text & "','" & Me.txtadd.Text & "','" & Me.txttel.Text & "', '" & _
                    Me.txtdate.Text & "','" & Me.txtpro.Text & "','" & Me.txtser.Text & "','" & Me.txtmod.Text & "', '" & _
                    Me.txtsnm.Text & "','" & Me.txtacc.Text & "','" & Me.txtsna.Text & "','" & Me.txtrem.Text & "')"
                cmd.ExecuteNonQuery()
                cnn.Close()
            End Sub
        End Class

    Is there anything wrong with my code?

    Read the article
