Search Results

Search found 6355 results on 255 pages for 'slow downs'.


  • Backup software for Ubuntu - which one?

    - by Industrial
    Hi everybody, I have spent some time during the last few weeks testing different backup solutions for my small home office, but I still haven't found anything that works well. We can definitely work with a non-GUI script if that's what it takes, as long as these requirements are met: (1) upload to Amazon S3 Europe, since we get unbelievably slow upload speeds to the US, so uploading 400+ GB of data will not be happening anytime this year; (2) incremental backups, so that only changed files are uploaded, or we will have a big bill from Amazon at the end of each month; (3) files should not be uploaded as one big per-folder archive, which is not efficient at all, since changing one file in a subfolder would force a huge two-digit-GB archive to be re-uploaded during the next backup. That is bad for the bill again, and for the traffic overhead on our internet connection. What options are available to us? Thanks!
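
    A hedged sketch of the kind of tool that fits all three requirements: duplicity does incremental, per-file backups to S3 and can target European buckets. The bucket name, source path, and credential handling below are placeholders, not a tested configuration; verify the S3 options against your duplicity version's man page.

        # Sketch only: paths, bucket name and credentials are placeholders.
        export AWS_ACCESS_KEY_ID=...        # your S3 key
        export AWS_SECRET_ACCESS_KEY=...    # your S3 secret
        export PASSPHRASE=...               # GPG passphrase for the archives

        # First run uploads a full backup; later runs upload only changed
        # files, in modest volume-sized chunks rather than one archive.
        duplicity --s3-use-new-style --s3-european-buckets \
            /home/office s3+http://my-backup-bucket/office

        # Start a fresh full backup monthly so the incremental chain
        # (and restore time) doesn't grow without bound.
        duplicity full --s3-use-new-style --s3-european-buckets \
            /home/office s3+http://my-backup-bucket/office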

    Read the article

  • CSS: removing horizontal space in a list menu using the display:inline property

    - by Kayote
    Hi all, I'm new to CSS and have set myself the target of learning CSS and publishing my website with it by the end of the month. My question: I'm trying to build a horizontal CSS menu with hover drop-downs; however, when I use the 'display: inline' property on the li (list) items, I get horizontal spaces between the li items in the bar. How do I remove this space? Here is the HTML:

        <div id="tabas_menu">
          <ul>
            <li id="tabBut0" class="tabBut">Overview</li>
            <li id="tabBut1" class="tabBut">Collar</li>
            <li id="tabBut2" class="tabBut">Sleeves</li>
            <li id="tabBut3" class="tabBut">Body</li>
          </ul>
        </div>

    And here is the CSS:

        #tabas_menu {
          position: absolute;
          background: rgb(123,345,567);
          top: 110px;
          left: 200px;
        }
        ul#tabas_menu {
          padding: 0;
          margin: 0;
        }
        .tabBut {
          display: inline;
          white-space: nowrap;
          list-style: none;
          background: -webkit-gradient(linear, 0% 0%, 0% 100%, from(rgba(255,142,190,1)), to(rgba(188,22,93,1)));
          background: -moz-linear-gradient(top, rgba(255,142,190,1), rgba(188,22,93,1));
          font-family: helvetica, calibri, sans-serif;
          font-size: 16px;
          font-weight: bold;
          line-height: 20px;
          text-shadow: 1px 1px 1px rgba(99,99,99,0.5);
          -moz-border-radius: 0.3em;
          -moz-box-shadow: 0px 0px 2px rgba(0,0,0,0.5);
          -webkit-border-radius: 0.3em;
          -webkit-box-shadow: 0px 0px 2px rgba(0,0,0,0.5);
          padding: 6px 18px;
          border: 1px solid rgba(0,0,0,0.4);
          margin: 0;
        }

    I can get the space removed using the 'float: left/right' property, but it bugs me that I cannot achieve the same effect by just using the display property.

    Read the article

  • Route gaming data over wireless and everything else through LAN?

    - by Alex
    I have two internet connections available to me. One is via LAN: not a great ping, but fast downloads. The other is via a USB wireless adapter: good ping, but slow downloads. I want to connect to both of them simultaneously, specify which data or applications should use the wireless connection, and route everything else through the LAN connection. Is this possible, and how would I do it? Windows 7 x64 is my operating system. Here is the output of route print: http://pastebin.com/vsjQRpSM. I'm still unsure how to use this to make all of my traffic go through the NVIDIA LAN interface, even after reading route /?. Also, if I'm able to achieve that, will it override ForceBindIP?
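
    A hedged sketch of the usual approach with the built-in route command: make the LAN gateway the preferred (lowest-metric) default route, then add narrow routes through the wireless gateway for the hosts that need the better ping. The gateway addresses and interface indexes below are placeholders; take the real values from the route print output. Note that routes select traffic by destination, not by application, which is exactly the gap ForceBindIP covers.

        :: Sketch only; run from an elevated prompt and substitute values
        :: from "route print". Lower metric wins for the default route.
        route add 0.0.0.0 mask 0.0.0.0 192.168.1.1 metric 10 if 11

        :: Send one game server through the wireless gateway instead.
        route add 203.0.113.50 mask 255.255.255.255 192.168.2.1 metric 5 if 13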

    Read the article

  • How can I turn off flash fill automatically in Excel 2013?

    - by user3480643
    Flash Fill breaks a lot of things in older Excel documents; it causes maddeningly slow transfers from cell to cell after updating. I am trying to find a way to turn off Flash Fill in Excel 2013 automatically, before rolling the product out to the rest of the staff in my company. Is there (preferably) a registry key that I can apply, or a switch that I can include during the install, that will turn this option off? Here is an image of the setting that I am looking to turn off: [screenshot of the Flash Fill checkbox in Excel options]. I haven't been able to find any documentation online about turning this off, other than this one page from MS: http://office.microsoft.com/en-ie/excel-help/turn-flash-fill-on-HA104043292.aspx
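
    One hedged way to find the key yourself: Excel 2013 keeps most of its Options checkboxes under the per-user 15.0 Options key, so toggling the checkbox while watching that key with Process Monitor should reveal the value, which can then be deployed with reg add or Group Policy Preferences. The value name EnableFlashFill below is an assumption, not a documented name; confirm it before rolling anything out.

        :: Sketch only: the Options key path is the standard Excel 2013
        :: location, but the value name "EnableFlashFill" is a guess to
        :: verify with Process Monitor while toggling the setting.
        reg add "HKCU\Software\Microsoft\Office\15.0\Excel\Options" ^
            /v EnableFlashFill /t REG_DWORD /d 0 /f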

    Read the article

  • Searching Multiple Terms

    - by nevets1219
    I know that grep -E 'termA|termB' files lets me search multiple files for termA OR termB. What I would like to do instead is search for termA AND termB. They do not have to be on the same line, as long as the two terms exist within the same file: essentially a "search within results" feature. I know I can pipe the results of one grep into another, but that seems slow when going over many files:

        grep -l "termA" * | xargs grep -l "termB" | xargs grep -E -H -n --color "termA|termB"

    Hopefully the above isn't the only way to do this. It would be extra nice if this could work on both Windows (I have Cygwin) and Linux. I don't mind installing a tool to perform this task.
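
    A single-pass alternative, as a hedged sketch: GNU awk (available on both Linux and Cygwin) can track both terms per file and skip ahead as soon as both have been seen, so every file is read at most once instead of up to three times.

        # Sketch using gawk: print each file containing both terms.
        gawk 'FNR == 1 { a = b = 0 }            # reset flags per file
              /termA/  { a = 1 }
              /termB/  { b = 1 }
              a && b   { print FILENAME; nextfile }' *

    The file list it prints can then be piped through xargs grep for the colored per-line view, as in the original pipeline.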

    Read the article

  • For a particular domain, how can I cache its JSON responses locally?

    - by Chris
    I'm coding the frontend of a web app that uses XHR to grab JSON data from a third party. The third-party service is slow, and because of its API design we need to make a LOT of API requests every time I refresh the page to test some new code. It's making the development loop painful. The requests are GETs, POSTs and PUTs, even though I'm pretty sure none of them actually change state. I want to fetch the JSON from localhost rather than from this third-party API, simply to make my development process faster.
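
    A hedged sketch of one way to do this without touching the app code much: run a tiny caching proxy on localhost and point the frontend's base URL at it. The sketch below uses Node.js; the upstream host is a placeholder, and it deliberately caches POST and PUT responses too, which is only sane in a development loop like the one described.

        // Sketch only: a dev caching proxy; UPSTREAM is a placeholder.
        var http = require('http');
        var https = require('https');
        var crypto = require('crypto');

        var UPSTREAM = 'api.example.com';   // placeholder for the real API host
        var cache = {};                     // key -> { status, headers, body }

        http.createServer(function (req, res) {
            var chunks = [];
            req.on('data', function (c) { chunks.push(c); });
            req.on('end', function () {
                var body = Buffer.concat(chunks);
                var key = crypto.createHash('sha1')
                    .update(req.method + ' ' + req.url + ' ').update(body)
                    .digest('hex');

                function send(entry) {      // replay a stored response
                    res.writeHead(entry.status, {
                        'Content-Type': entry.headers['content-type'] || 'application/json'
                    });
                    res.end(entry.body);
                }

                if (cache[key]) { return send(cache[key]); }

                var up = https.request({
                    host: UPSTREAM, path: req.url, method: req.method,
                    headers: { 'Content-Type': req.headers['content-type'] || '' }
                }, function (upRes) {
                    var out = [];
                    upRes.on('data', function (c) { out.push(c); });
                    upRes.on('end', function () {
                        // Dev-only shortcut: cache POST/PUT too, since the
                        // requests don't really change state here.
                        cache[key] = { status: upRes.statusCode,
                                       headers: upRes.headers,
                                       body: Buffer.concat(out) };
                        send(cache[key]);
                    });
                });
                up.on('error', function () { res.writeHead(502); res.end(); });
                up.end(body);
            });
        }).listen(8080);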

    Read the article

  • Default profile for large

    - by user63434
    Hi, I am setting up a master image to clone to client machines of the same type (Windows 7). I logged in as administrator and installed all the programs, changed the desktop settings, etc., but my local administrator profile is now 244 MB in size, and it will become the local machine's default profile after sysprep. We have a 2003 server, and I want to use a mandatory profile for all login users, which means I need to copy this profile to the server so that every user logging in to the domain uses it. Loading a 244 MB profile is going to be very slow, and since it is removed from the client at logoff, the next login will take a long time again. Is there anything I can do? Can I copy just the bare minimum files from the default profile to the server? I am not sure which parts I need; I read that I must copy My Documents and My Documents/Pictures so that folder redirection will work. What else do I need to copy to the server? I also have Firefox with Xmarks sync, MS Word, etc. Thanks
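
    A hedged sketch of how the trimmed copy is often done with robocopy: copy the profile to the server share while excluding the local caches that account for most of the bulk. The share path and exclusion list are placeholders to adapt; note that Windows 7 clients look for a roaming/mandatory profile folder with a .V2 suffix.

        :: Sketch only: paths and exclusions are placeholders.
        :: /E copies subfolders, /XJ skips the junction points Windows 7
        :: creates inside profiles, /XD drops the heavy local caches.
        robocopy C:\Users\TemplateUser \\server2003\profiles\mandatory.V2 ^
            /E /XJ /XD "AppData\Local" "AppData\LocalLow" Temp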

    Read the article

  • Syncing 1TB+ to an iSCSI device, software needed

    - by mojah
    Hi, I need to sync a local disk to an iSCSI mount on Windows (Server 2003), and I'm struggling to find software capable of doing so in a reasonable timeframe. Notes on the current 1 TB disk: 800 GB is currently in use, and it contains a folder with several hundred thousand subfolders, which in turn hold several thousand files each. So I'm trying to find a piece of software that can handle file lists that large and give me a good estimate of when the copy will finish. I've tried DeltaCopy (the rsync GUI client for Windows), but it's intolerably slow and doesn't give a useful estimate of the time remaining. DeltaCopy: http://www.aboutmyip.com/AboutMyXApp/DeltaCopy.jsp Does anyone know alternative software for Windows that would do this well?
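
    One candidate worth sketching: robocopy (in the Server 2003 Resource Kit Tools) copes with very large trees, mirrors incrementally, and can print estimated completion times. The drive letters below are placeholders for the local disk and the mounted iSCSI volume.

        :: Sketch only: source and destination paths are placeholders.
        :: /MIR mirrors incrementally, /R:1 /W:1 stops retries from
        :: stalling the run, /ETA prints estimated arrival times, and
        :: /NP /LOG keep the enormous file list out of the console.
        robocopy D:\data E:\data /MIR /R:1 /W:1 /ETA /NP /LOG:C:\robosync.log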

    Read the article

  • Specific apache + mysql settings for a light-weight site

    - by Good Person
    I have a small website with Joomla and Moodle set up, and both seem very slow. The server (CentOS release 5.5 (Final)) is a virtual dedicated server with about 2 GB of RAM. I don't expect ever to have more than 10-15 people on at the same time (and even that is high). What settings could I change in Apache, MySQL, or even the OS to increase the performance of my site? I'm not concerned about running out of resources if I get too many visitors. If you need more specific data, leave a comment and I'll edit the question.
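
    As a starting-point sketch for a box this size (assuming the prefork MPM and a mostly MyISAM MySQL, which is typical for this stack; the numbers are rough values to benchmark against, not tuned settings):

        # httpd.conf (prefork): keep the worker pool small so Apache
        # can never push a 2 GB box into swap; 10-15 users need little.
        StartServers          2
        MinSpareServers       2
        MaxSpareServers       5
        MaxClients           30
        KeepAlive            On
        KeepAliveTimeout      3

        # my.cnf: give MySQL a modest, fixed slice of memory.
        key_buffer        = 64M
        query_cache_size  = 32M
        table_cache       = 512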

    Read the article

  • MySQL transfer / update (a bit specific)

    - by Jeff
    Before posting I dug through the whole site but didn't find help for my problem, so I hope someone can help... Facts: a 30 GB MySQL database on a remote server (about 20,000,000 rows); the data is updated once weekly on the local network (MySQL); I need to transfer/replace the remote database with the locally updated one; the connection is about 2 MB (real megabytes, not megabits) up/down. The point is that I can't have any downtime on the remote MySQL server. Until now I have tried: Navicat data sync (OK, but takes about 3 days to finish); dbForge (OK, but needs 5 days to finish); transferring a mysqldump to the remote server and executing it (about a day, but a lot of downtime); rsyncing the database folder /mysql/lib/MY_DATABASE (4 hours, but after that I always need to run a repair on the remote server, which takes about 2 hours, plus a lot of downtime); piping a mysqldump from the command line directly to the remote server (still not satisfied, many problems); and MySQL replication (slow). I could list more things that I tried... Anyway, what is the best way to refresh the remote MySQL weekly and at the same time have zero downtime and no huge server load? If you have any ideas, please share.
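
    A hedged sketch of one pattern that gets close to zero downtime: load the weekly dump into a staging schema on the remote server (slow, but invisible to clients), then swap it into place with RENAME TABLE, which MySQL performs atomically across tables. All database and table names below are placeholders, and the old-versions schema must already exist.

        # 1. Dump locally, compress hard for the thin uplink, and load
        #    into a staging schema on the remote server.
        mysqldump mydb | gzip | ssh remote 'gunzip | mysql mydb_staging'

        # 2. Atomic swap: clients see the old data until the instant
        #    the new data is complete, with no repair step afterwards.
        mysql -h remote -e "RENAME TABLE \
            mydb.big_table         TO mydb_old.big_table, \
            mydb_staging.big_table TO mydb.big_table"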

    Read the article

  • System File Checker vs Service Pack Reinstall

    - by Nixphoe
    When trying to repair slow workstations, I've found that running sfc /scannow helps quite a lot in a few of my environments that run really old computers. I've also seen recommendations to reinstall the most recent service pack after software installations, to help keep the system stable. That makes sense, as it would replace many of the DLL files with the ones that ship in the service pack. The two approaches seem to do the same thing, but SFC sometimes asks for a disc where the service pack reinstall does not. What is the main difference between the two?

    Read the article

  • What causes Remote Desktop Services Manager to crash in Server 2008 R2?

    - by milkmood
    I have a consistent problem with RDSM crashing on Server 2008 R2. It is either really slow to open, sometimes never opens, or, after it has been open and working properly for a while, it stops working and forces an unload of the snap-in. It has done this since the deployment of this server: new hardware, new instance of Server 2008 R2, domain administrator login. I am using it to manage three terminal servers; the other two are Server 2003. I've used it without issues on other 2008 servers.

    Read the article

  • How to choose NoSQL database engine?

    - by Poma
    We have a database with the following specs: 30k records, 7 MB in size; 20 inserts/second; 1000 updates/second; 1000 range selects/second by secondary index, approx. 10 rows each; it needs at least one secondary index; it needs some mechanism to expire keys that are not updated for 75 seconds (this can be done with a programmatic garbage collector, but that requires an additional 'last_update' index and adds some load); consistency is not required; durability is not required; the DB should be stored in memory. For now we use Redis, but it does not have secondary indexes, and its KEYS index:foo:* pattern scan is too slow. Membase also does not have secondary indexes (as far as I know). MongoDB and the MySQL MEMORY engine have table-level locks. Which engine will fit our use case?
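
    For comparison, a hedged sketch of how both missing pieces are commonly emulated in Redis itself with sorted sets, since a KEYS pattern scan walks the entire keyspace: one sorted set per secondary index for the range selects, and one ordered by last-update timestamp for the 75-second expiry. All key names and values are placeholders.

        # Secondary index on a numeric field (maintained on each write):
        ZADD idx:score 42 record:1001
        ZRANGEBYSCORE idx:score 40 50          # range select, ~10 rows

        # Expiry: track last-update times, then periodically list, DEL
        # and drop everything older than 75 s (here, now = 1300000075).
        ZADD idx:last_update 1300000000 record:1001
        ZRANGEBYSCORE idx:last_update -inf 1300000000     # stale keys
        ZREMRANGEBYSCORE idx:last_update -inf 1300000000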

    Read the article

  • How do I stop my IIS App Pool making a request to wpad.mydomain.com?

    - by Programming Hero
    As part of some performance troubleshooting, I've monitored the slow startup of a "cold" app pool (one without an active worker process) in IIS. When using a built-in account, the app pool starts in sub-second time. When using a custom local account, the app pool takes 30+ seconds to start processing requests. The service appears to be making requests to wpad.mydomain.com, an address it does not have access to, which causes it to wait 30 seconds for a response before eventually timing out. As a workaround, I've added the hostname to the server's hosts file to direct the traffic to the local machine, which responds much faster (1-2 seconds). What do I need to do to stop IIS making this request when this identity is used for the app pool?
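
    The wpad lookup is web-proxy auto-detection, which the .NET stack performs per user account the first time it makes an outbound request. Assuming the requests come from managed code in the application (rather than from native components), a hedged sketch of the usual switch is to disable the default proxy in the site's web.config:

        <!-- Sketch: stop .NET in this application from auto-detecting
             a proxy, which is what triggers the wpad lookup. -->
        <configuration>
          <system.net>
            <defaultProxy enabled="false" />
          </system.net>
        </configuration>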

    Read the article

  • How to protect a PHP app (vBulletin) from hackers

    - by samsmith
    Our vBulletin system is under constant attack, raising the CPU load and making the system very slow for legitimate users. It is a scripted attack that attempts to log in and/or create new login IDs (mostly it tries to create login IDs in order to spam the site). In vBulletin we have blacklisted large ranges of IPs, which has helped a lot, but the attacks continue. Is there an automated way to protect the application or the web server? Ideally, the protection would detect the pages being accessed and automatically blacklist the IP.
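
    A hedged sketch of the usual automated blacklist at the web-server level: fail2ban tails the Apache access log for repeated POSTs to the login and registration URLs and inserts temporary iptables bans. The URL patterns, log path, and thresholds below are placeholders to adapt to your vBulletin install.

        # /etc/fail2ban/filter.d/vbulletin.conf  (sketch; URLs are guesses)
        [Definition]
        failregex = ^<HOST> .*"POST /(login|register)\.php

        # /etc/fail2ban/jail.local
        [vbulletin]
        enabled  = true
        filter   = vbulletin
        action   = iptables-multiport[name=vbulletin, port="http,https"]
        logpath  = /var/log/httpd/access_log
        maxretry = 5
        findtime = 60
        bantime  = 3600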

    Read the article

  • MS Excel: Can I link images using a relative path?

    - by Port Islander 2009
    I am working on an MS Excel document that contains a lot of images (around 200). They are currently saved within the document, so the file has become huge and working with it is very slow. Linking the pictures without saving them works very well: I now have the Excel document and a folder "pictures" next to it that contains all my image files. However, when I move the document and the folder to a new location, all my pictures disappear. This seems to be because Excel saves the link information as absolute paths. (Update: actually, according to this thread, Excel stores the link information as relative paths as well, so now I really don't know why my links break.) Is there a convenient way to save the links as relative paths, or to have Excel update the path information automatically? Update: it's important that the images are displayed on the sheet and can be printed. I am working with Microsoft Excel for Mac 2008 and 2011. I really appreciate your help.

    Read the article

  • Solaris 10: Identify a PID and the CPU it's running on

    - by Marcus
    I have multiple instances of a database running on a Solaris system, and I'd like to prove that each database process is being handled by a different CPU. Essentially, I want to do something like ps -ef | grep <process_name> to get the PIDs, and then run another command (if required) to identify the CPU... Is prstat able to do this? I'm assuming that as each database instance starts, it uses a different CPU, but I'm not sure I'm understanding this correctly... The reason I want to do this is that Sun hardware has slow CPUs, but lots of them; to get the best performance out of it, I need to try to spread the load among the CPUs... Thanks
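
    A hedged sketch with the stock Solaris 10 tools: psrinfo lists the processors, ps can report the processor each process is bound to (PSR shows "-" for unbound processes, which the scheduler places freely), and pbind pins an instance to a CPU if you want to force the spread. The PIDs and processor IDs below are examples.

        # List CPUs, then show each database PID with its bound processor.
        psrinfo
        ps -e -o pid,psr,args | grep <process_name>

        # Optionally bind one instance per processor (IDs are examples).
        pbind -b 0 12345
        pbind -b 1 12346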

    Read the article

  • MySQL process goes over 100% of CPU usage

    - by Temnovit
    Hello! I'm experiencing some problems with my LAMP server. Recently everything became very slow, even though the visitor count on my websites didn't change much. When I run the top command, it says the mysql process has taken 150-200% of the CPU. How is that possible? I always thought 100% was the maximum. I'm running Ubuntu 9.04 server edition with 1.5 GB RAM. my.cnf settings:

        key_buffer              = 64M
        max_allowed_packet      = 16M
        thread_stack            = 192K
        thread_cache_size       = 8
        myisam-recover          = BACKUP
        max_connections         = 200
        table_cache             = 512
        table_definition_cache  = 512
        thread_concurrency      = 2
        read_buffer_size        = 1M
        sort_buffer_size        = 4M
        join_buffer_size        = 1M
        query_cache_limit       = 1M    # the maximum size of individual query results
        query_cache_size        = 128M

    (The MySQLTuner output and the top output were attached as screenshots.) What could be the cause of this problem? Can I make changes to my.cnf to prevent the server from hanging?

    Read the article

  • Our company has 100,000s+ photos, how to store and browse/find these efficiently?

    - by tobefound
    We currently store our photos in a structure like this:

        folder\1\10000 - 19999.JPG|ORF|TIF   (10,000 files)
        folder\2\20000 - 29999.JPG|ORF|TIF   (10,000 files)
        etc...

    They are stored on four different 2 TB D-Link NASes, attached and shared on our office network (\\nas1, \\nas2, and so on). Problems: 1) When a client (Windows only, Vista and 7) browses, say, the \\nas1\folder\1\ folder, performance is quite poor; the list takes a long time to generate in the Explorer window, even with icons turned off. 2) Initial access to the NAS itself is sometimes slow. SAN disks are too expensive for us, even with iSCSI interface/switch technology. I've read a lot of tech pages saying that storing 100,000+ files in one single folder shouldn't be a problem, but we don't dare go there now that we experience problems at the 10K level. All input greatly appreciated, /T

    Read the article

  • Windows 7 system CPU bogged by Windows services, no explanation

    - by Alex
    I'm looking at a colleague's laptop, which is running terribly slow. A quick look showed the CPU 100% occupied by 2-3 svchost processes, which of course doesn't tell much, since those are just host processes with services running underneath them. So I fired up Process Explorer hoping to find a shady rogue service bogging down the system, but to my surprise I found that genuine Windows services (or at least very well disguised ones) were bogging it down: Dnscache (DNS Client), IKEEXT (IKE and AuthIP IPsec Keying Modules), and iphlpsvc (IP Helper). Seen separately, it might seem odd for these services to use a lot of CPU, but taking a step back, all three are closely related to networking. I've tried running netsh int ip reset log.txt, which has helped me solve bizarre network-related problems in the past, but it didn't help. Of course I thought about a virus, but full scans with both MS Security Essentials and Malwarebytes came up clean.
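
    A hedged sketch for narrowing this further with built-in tools: map each svchost PID to the services it hosts, then split a suspect service into its own process so Task Manager attributes the CPU to it alone. The service name in the second command is one of the three from this question.

        :: Map svchost PIDs to the services running inside them.
        tasklist /svc /fi "imagename eq svchost.exe"

        :: Give a suspect service its own svchost instance (takes effect
        :: after a restart); its CPU use then shows under its own PID.
        sc config iphlpsvc type= own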

    Read the article

  • Optimizing JS Array Search

    - by The.Anti.9
    I am working on a browser-based media player written almost entirely in HTML5 and JavaScript. The backend is written in PHP, but it has only one function: to fill the playlist on the initial load. The rest is all JS. There is a search bar that refines the playlist, and I want it to refine as the person types, like most media players do. The only problem is that this is very slow and laggy, as there are about 1000 songs in the program and there are likely to be more as time goes on. The initial playlist load is an AJAX call to a PHP page that returns the results as JSON. Each item has four attributes: artist, album, file, url. I loop through each object and add it to an array called playlist. At the end of the loop a copy of playlist is created, backup, so that I can refine the playlist variable when people refine their search, but still repopulate it from backup without making another server request. The method refine() is called when the user types a key into the search box. It flushes playlist and searches each property (not including url) of each object in the backup array for a match against the search string. If there is a match in any of the properties, it appends the information to the table that displays the playlist and adds the object to playlist for access by the actual player. Code for the refine() method:

        function refine() {
            $('#loadinggif').show();
            $('#library').html("<table id='libtable'><tr><th>Artist</th><th>Album</th><th>File</th><th>&nbsp;</th></tr></table>");
            playlist = [];
            for (var j = 0; j < backup.length; j++) {
                var sfile = new String(backup[j].file);
                var salbum = new String(backup[j].album);
                var sartist = new String(backup[j].artist);
                if (sfile.toLowerCase().search($('#search').val().toLowerCase()) !== -1 ||
                    salbum.toLowerCase().search($('#search').val().toLowerCase()) !== -1 ||
                    sartist.toLowerCase().search($('#search').val().toLowerCase()) !== -1) {
                    playlist.push(backup[j]);
                    num = playlist.length - 1;
                    $("<tr></tr>").html("<td>" + num + "</td><td>" + sartist + "</td><td>" + salbum + "</td><td>" + sfile + "</td><td><a href='#' onclick='setplay(" + num + ");'>Play</a></td>").appendTo('#libtable');
                }
            }
            $('#loadinggif').hide();
        }

    As I said before, for the first couple of letters typed this is very slow and laggy. I am looking for ways to refine the method to make it much faster and smoother.
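
    A hedged sketch of the three changes that usually remove this kind of lag: read and lowercase the query once per call instead of three times per song, match against pre-lowercased text built once at load time, and assemble the table as one HTML string so the DOM is touched once rather than once per matching row. It assumes the same backup, playlist and markup as the question.

        // Sketch only; rebuild "index" whenever backup is reloaded.
        var index = [];
        for (var i = 0; i < backup.length; i++) {
            index.push((backup[i].artist + ' ' + backup[i].album + ' ' +
                backup[i].file).toLowerCase());
        }

        function refine() {
            var q = $('#search').val().toLowerCase();   // read query once
            var rows = [];
            playlist = [];
            for (var j = 0; j < backup.length; j++) {
                if (index[j].indexOf(q) !== -1) {
                    var num = playlist.push(backup[j]) - 1;
                    rows.push('<tr><td>' + num + '</td><td>' + backup[j].artist +
                        '</td><td>' + backup[j].album + '</td><td>' + backup[j].file +
                        "</td><td><a href='#' onclick='setplay(" + num + ");'>Play</a></td></tr>");
                }
            }
            // One DOM write instead of one append per matching song.
            $('#library').html("<table id='libtable'><tr><th>&nbsp;</th><th>Artist</th><th>Album</th><th>File</th><th>&nbsp;</th></tr>" +
                rows.join('') + '</table>');
        }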

    Read the article

  • Performance problems - Jira running on Ubuntu over VMware ESX 4.0 maxing out all 4 vCPUs.

    - by Jack T
    We are running Jira in a VM under VMware ESX 4.0, and performance is variable, to say the least. The physical box has 12 GB RAM and 4 Xeon 2.26 GHz CPUs. vCenter tells us the CPUs are never maxed out, and RAM is fine too. Yet when we issue a request to the host, it sometimes maxes out all 4 vCPUs; sometimes it's quick, sometimes very, very slow, and there doesn't seem to be a pattern. Any ideas?
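
    One hedged place to look first: co-scheduling contention. A 4-vCPU guest has to wait for enough physical cores to be free at once, which produces exactly this fast-sometimes, slow-sometimes pattern while average CPU use looks low. esxtop on the ESX host shows this directly.

        # On the ESX host (sketch): open esxtop, press 'c' for the CPU
        # view, and watch the %RDY column for the Jira VM. Sustained
        # double-digit ready time means its vCPUs are queueing for
        # physical cores; fewer vCPUs often helps in that case.
        esxtop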

    Read the article

  • How to calculate proper inode and block sizes for a Linux filesystem

    - by Donatello
    I have an old ReiserFS filesystem which I'm going to convert to ext3. The problem I have is determining the proper block and inode sizes for this partition. It is 44 GB and has to hold 3,000,000+ files with sizes between 1 KB and 10 KB. How can I figure out the best ratio of inodes and block size? Below is what I tried; it seems OK, but copying files onto it is incredibly slow.

        mkfs.ext3 -t ext3 -c -c -b 1024 -i 4096 -I 128 -v -j -O sparse_super,filetype,has_journal /dev/sdb1

    Thanks.
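
    A hedged back-of-the-envelope for these numbers: 44 GB at one inode per 4096 bytes (-i 4096) already yields over 10 million inodes, more than three times the 3 million files required, so the inode ratio is fine. The doubled -c makes the mkfs run itself very slow (it is a read-write badblocks test), while the 1 KB block size increases per-file block bookkeeping during the copy; 2 KB blocks are a common compromise for 1-10 KB files. A leaner sketch:

        # ~44 GB / 4096 bytes-per-inode ≈ 10.7M inodes for ~3M files.
        # dir_index speeds up lookups in directories with many entries.
        mkfs.ext3 -b 2048 -i 4096 -I 128 -j -v -O dir_index /dev/sdb1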

    Read the article

  • How can I speed up my macro in Excel 2003?

    - by user144872
    I have a macro that copies data from one cell to another and uses a VLOOKUP formula, among other things. My spreadsheet contains nearly 2000 rows. When I run the macro in Excel 2003, Excel starts to slow down once it passes row 500, and it gets even worse after row 1000; the whole run takes more than 5 hours. In Excel 2007, however, the same macro finishes in only half an hour. Can anyone help me find a good solution?
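
    A hedged first thing to try, sketched in VBA: suspend screen redraws and automatic recalculation while the macro runs. With a VLOOKUP on the sheet, Excel 2003 recalculates after every cell the macro writes, which grows worse as rows accumulate and can alone account for hours.

        ' Sketch: wrap the existing macro body with these settings.
        Application.ScreenUpdating = False
        Application.Calculation = xlCalculationManual

        ' ... existing copy/VLOOKUP loop goes here ...

        Application.Calculation = xlCalculationAutomatic
        Application.ScreenUpdating = True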

    Read the article
