Search Results

Search found 10417 results on 417 pages for 'large'.

Page 205/417

  • Should I bother upgrading my Opteron 270 Server?

    - by MousePad
    I have an Opteron server machine (in a large workstation-class case) running on the Tyan 2895 motherboard. It's a dual-socket board, but I only have one 270 in there. I have 4GB of RAM, but less than 3GB is addressable, even in 64-bit mode, due to the way the board is designed. Is it worth spending a few hundred on an additional CPU and maybe some more RAM? The other problem is that one of the two SATA ports on the board had its wire socket break off, so only one drive can be run as of now. I could have it repaired, but at what cost? Add in the fact that the power supply is gunked up with dust and it's a bit of a nightmare. I actually worry about it getting too hot. It seems that for the money I could buy a new rack server from Dell, but it also seems a shame to waste an otherwise working, and for my needs still very fast, machine.

    Read the article

  • Should we regularly schedule mysqlcheck (or database optimization)?

    - by scatteredbomb
    We run a forum with some 2 million posts, and I've noticed that if left untouched, the overhead in MySQL (as listed in phpMyAdmin) can get quite large (hundreds of megabytes). I'm wondering if scheduling a regular mysqlcheck to optimize the tables is good practice? Any reason not to do it, say, once a week at an off-peak hour? There was a time over the summer when our site was constantly crashing because MySQL was using up all resources. That's when I noticed the huge amount of overhead and optimized the database, and we haven't had any stability problems since. I figured if that was helping alleviate the issues, I should just set up a cron job to do it automatically.
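
    A minimal crontab sketch for a weekly off-peak optimize pass (the Sunday 4 a.m. slot, user, and log path are assumptions to adjust):

        # m h dom mon dow  command
        0 4 * * 0  mysqlcheck --optimize --all-databases --user=admin --password=secret >> /var/log/mysqlcheck.log 2>&1

    One caveat worth knowing: OPTIMIZE TABLE locks each MyISAM table while it runs, which is exactly why the off-peak hour matters.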

    Read the article

  • Windows 7: searching for files with special characters in the name

    - by Luke
    I have a rather large source code repository on my machine; it is not indexed by Windows Search. I am trying to find some oddly-named generated files of the form .#name.extension.version, where name and extension are ordinary names and extensions, and version is a numeric value (e.g. 1.186). On Windows XP I could find these files by searching for .#*; on Windows 7 that just returns every single file and directory. So my question is this: is it possible to find files named like that using the built-in Windows 7 search functionality? I did find this question which is very similar, but the answer doesn't work for me; it seems like any special character I put in the query is either ignored or treated as a wildcard, and as a result it matches every single file and directory. Is there perhaps some registry value I can set to make the search-by-filename feature work with special characters?
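
    As a fallback, the classic command prompt still honours literal wildcard patterns; a sketch from cmd, with the repository path assumed:

        cd /d C:\source\repo
        dir ".#*" /s /b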

    Read the article

  • What's faster, cp -R or unpacking tar.gz files?

    - by Buttle Butkus
    I have some tar.gz files that total many gigabytes on a CentOS system. Most of the tar.gz files are actually pretty small, but the ones with images are large: one is 7.7G, another is about 4G, and a couple are around 1G. I have unpacked the files once already, and now I want a second copy of all those files. I assumed that copying the unpacked files would be faster than re-unpacking them, but I started running cp -R about 10 minutes ago and so far less than 500M has been copied. I feel certain that the unpacking process was faster. Am I right? And if so, why? It doesn't seem to make sense that unpacking would be faster than simply duplicating the existing structures.
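
    One way to test this directly, and a common speed-up for trees of many small files, is a tar pipe in place of cp -R; a sketch with assumed paths:

        mkdir -p /data/copy2
        # time the straight copy
        time cp -R /data/unpacked /data/copy1
        # versus streaming the tree through tar
        time sh -c 'tar -C /data/unpacked -cf - . | tar -C /data/copy2 -xf -'

    The tar pipe batches reads and writes into sequential streams, which tends to beat cp's file-by-file traversal when metadata operations dominate.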

    Read the article

  • iTunes high CPU usage

    - by Calm Storm
    I upgraded to iTunes 10.4.1 on Windows 7, and my iTunes library is not that large at all (say about 20GB). When I start iTunes, CPU usage goes to 60-80% and stays there for a long time. I see that iTunes.exe takes about 70% of CPU in Process Explorer, and it spawns a SearchProtocolHost.exe every 2 minutes or so which takes < 0.1% CPU. Other than that, iTunes.exe is always at 70-90% and never lets me do anything else. Does someone have a suggestion? EDIT: I have tried reinstalling 10.4.1, completely deleting my library and starting with a plain installation, and that does not work. I have tried downgrading to 10.3.x and that does not work either :(

    Read the article

  • Limiting memory usage and minimizing swap thrashing on Unix / Linux

    - by camelccc
    I have a few machines that I use for running large numbers of jobs, where I try to limit the number of jobs so as not to exceed the available RAM of the machine. Occasionally I mis-estimate how much memory some of the jobs will take, and the machine starts thrashing the swap file. I resolve this by sending kill -s STOP to one of the jobs so that it can get swapped out. Does anyone know of a utility that will monitor a server for processes with a specific name, and then pause the one with the smallest memory footprint if the total memory consumption reaches a desired threshold, so that the larger ones can run and complete with a minimum of swap-file thrashing? Paused processes then need to be resumed once some existing processes have completed.
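
    I don't know of a stock utility for this, but a cron-driven sketch along these lines could serve as a starting point (the process name, threshold, and free-memory column are all assumptions to adapt):

        #!/bin/sh
        NAME=bigjob            # hypothetical name of the jobs to watch
        THRESHOLD_MB=512       # pause something when free memory drops below this

        free_mb=$(free -m | awk '/^Mem:/ {print $4}')   # column 4 = "free"; check your procps version
        if [ "$free_mb" -lt "$THRESHOLD_MB" ]; then
            # --sort=rss lists the smallest resident set first
            pid=$(ps -C "$NAME" -o pid= --sort=rss | head -n 1)
            [ -n "$pid" ] && kill -STOP "$pid"
        fi
        # resume later with: kill -CONT <pid>

    The question pauses the smallest footprint; stopping the largest instead (tail -n 1) frees more memory per pause, so both policies are worth trying.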

    Read the article

  • In-House Dropbox

    - by beardedlinuxgeek
    Dropbox is perfect, but as a company we can't host anything worthwhile on servers that we don't control. So I've been tasked with coming up with a Dropbox alternative, something in-house. GlusterFS is nice, but there's no offline access. SparkleShare uses Git, which isn't great for large files, and it also doesn't have a Windows port. Any other options? If I were to roll out my own from scratch, what do you think the best way to go about it would be?
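
    If you do roll your own, a cron-driven rsync push is the usual starting point; a one-way sketch with hypothetical host and paths (true two-way sync with conflict handling needs something like Unison layered on top):

        # push local changes to the in-house server every 5 minutes
        */5 * * * *  rsync -az --delete /home/user/share/ user@fileserver:/srv/share/user/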

    Read the article

  • Multiple File Upload in FTP Using CMD

    - by user697363
    I have a large number of files, over 10,000, which I want to upload to an FTP server. I can't zip the files and upload the archive, as I have to read the files individually in SAS software for my analysis. If I use the mput command, the prompt asks me to answer "y" each time it tries to upload a file, which is very cumbersome. Is there any way to upload the files automatically, without my having to enter "y" for each one? The commands I was using were:

        ftp ftp.myftp.com          (my FTP server name)
        username: myusername
        password: mypassword
        ftp> lcd c:\local_folder
        ftp> mput *.*
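
    If this is the stock Windows ftp.exe, interactive prompting can simply be switched off; two hedged variants of the same session:

        ftp -i ftp.myftp.com       (the -i switch disables prompting for mget/mput)

    or, inside an already-open session:

        ftp> prompt                (toggles "Interactive mode Off")
        ftp> lcd c:\local_folder
        ftp> mput *.*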

    Read the article

  • Searching for online database software/CMS

    - by ButterdBread
    I am searching for software or a CMS that manages and displays large online databases, as a kind of frontend to MySQL or any other database. It should be accessible through the browser and be as secure as possible (offering a login). The data I'd like to store would be personal information such as name, address, and birthday; I'd also need to be able to add custom fields. Forms and the possibility to download the data as an Excel table would be great too. phpMyAdmin is not an option; it should be similar to a CRM but more closely adapted to managing database tables, searching for entries, and filtering data. It should be possible to have many user accounts with different rights, with each of them being able to access certain parts of the data and enter their own data. Is there something out there that comes close to what I imagine? I appreciate any help!

    Read the article

  • Hosting solution for sensitive client data

    - by Mark
    Hello, we are developing a web application that will deal with highly sensitive (financial) data of clients (the audience is medium to large sized businesses). Clients will be under scrutiny from regulators and auditors and, as such, we will be too. More importantly, to give clients a level of comfort, our application and the related hosting arrangement should instill a lot of confidence in them. We are looking into using a cloud-based service like Linode, Amazon EC2, etc. To allow for maximum flexibility, we are keen on putting everything on virtual servers and avoiding having to buy our own hardware. Does a cloud-based service make sense for our particular scenario? If not, what type of hosting should we consider? If so, what should we look out for? Thanks!

    Read the article

  • How can I tell if a host is bridged and acting as a router

    - by makerofthings7
    I would like to scan my DMZ for hosts that are bridged between subnets and have routing enabled. Since I have everything from VMware servers to load balancers in the DMZ, I'm unsure whether every host is configured correctly. What IP, ICMP, or SNMP (etc.) tricks can I use to poll the hosts and determine if a host is acting as a router? I'm assuming this test presumes I know the target IP, but in a large network with many subnets I'd have to test many different combinations of networks and see which succeed. Here is one example (ping): for each IP in the DMZ, ARP for the host's MAC, then send an ICMP echo request to that MAC, addressed to an online host on each subnet. I think there may be a more direct way to get this information, namely from within ICMP/IP itself, but I'm not sure what low-level bits to look for. I would also be interested in whether it's possible to determine "router" status without knowing the subnets the host may be connected to. This would be useful to know when improving our security posture.
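
    Two hedged starting points, with all addresses invented for illustration. If you have SNMP read access, each host will tell you directly whether it forwards packets; failing that, you can route a probe through the suspect and see if anything answers:

        # forwarding(1) means routing is enabled on the host
        snmpget -v2c -c public 10.10.1.5 IP-MIB::ipForwarding.0

        # empirical test from a Linux box: point a foreign subnet at the suspect, then probe
        ip route add 192.168.50.0/24 via 10.10.1.5
        ping -c 3 192.168.50.1
        ip route del 192.168.50.0/24 via 10.10.1.5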

    Read the article

  • Hooking up many different external HDs simultaneously

    - by cbizz
    I need a large amount of external storage for an upcoming project. I'm planning on purchasing ten 2TB external drives, and I need them all hooked up to a single machine at the same time. What issues will I run into? I plan on using two power strips and having them all externally powered from the wall, and I will use a USB hub to plug in all the drives. I need drive access time to be as fast as possible. I am using Ubuntu Linux (64-bit). Will I be able to mount 10 drives?
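
    Mounting ten drives is the easy part; a sketch, assuming the kernel enumerates them as /dev/sdb through /dev/sdk (actual device names will vary):

        for d in b c d e f g h i j k; do
            mkdir -p /mnt/ext$d
            mount /dev/sd${d}1 /mnt/ext$d
        done

    The bigger constraint is the hub: all drives on a single USB 2.0 hub share one ~480 Mbit/s upstream link, so access will be slow whenever several drives are busy at once.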

    Read the article

  • Updating shared files across computers

    - by murgatroid99
    I have a file server running Windows Server 2008 and a couple of laptops running Windows 7 on a network. There are a large number of files that all users will need access to. My plan is to have the files on both the server and the laptops because the users will need to access the files in places with no Internet access. I also want any changes made to the files on any of the laptops to propagate to the server and then propagate to the other laptops whenever they connect to the network. Should I do this with a scheduled batch script with a few xcopy commands or is there a better way to do it?
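
    If you stay with a script, robocopy is a better fit than xcopy because it skips files that haven't changed; a crude two-pass sketch (share and local path hypothetical):

        robocopy \\fileserver\shared C:\shared /E /XO /Z
        robocopy C:\shared \\fileserver\shared /E /XO /Z

    /E copies subfolders, /XO skips files older than the copy already at the destination, and /Z makes copies restartable. Note the limitation: if the same file is edited on two laptops between syncs, the newer timestamp silently wins, which is where a real sync tool earns its keep.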

    Read the article

  • How to protect a PHP app (vBulletin) from hackers

    - by samsmith
    Our vBulletin system is under constant attack, raising CPU load and making the system very slow for legitimate users. The attack is a script-type attack that is attempting to log in and/or create new login IDs (mostly it is trying to create login IDs in order to spam the site). In vBulletin we have blacklisted large ranges of IPs, which has helped a lot, but the attacks continue. Is there an automated way to protect the application or web server? Ideally, the protection would detect the pages accessed and automatically blacklist the offending IP.
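
    One automated option is fail2ban watching the web server's access log; a sketch of a jail and filter, where the URL pattern and the limits are assumptions to tune against your own logs:

        # /etc/fail2ban/jail.local
        [vbulletin]
        enabled  = true
        filter   = vbulletin
        logpath  = /var/log/apache2/access.log
        findtime = 60
        maxretry = 10
        bantime  = 3600

        # /etc/fail2ban/filter.d/vbulletin.conf
        [Definition]
        failregex = ^<HOST> .*"POST /(login|register)\.php

    Any IP that POSTs to the login or registration pages more than ten times a minute gets firewalled for an hour, which matches the described bot behaviour without touching legitimate readers.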

    Read the article

  • Trying to figure out Windows Server 2008 and Active Directory - complete novice!

    - by Simon E
    I am a novice in networking and I want to learn more about Windows Server 2008 and Active Directory. Basically, I have several Windows 7 PCs in my home and an IBM server with Windows Server 2008 installed. I have enabled Active Directory and set the server to have a static IP. That's as far as I have got, and I feel a little overwhelmed at the moment, as I am not sure what to do next. What I eventually want is for the Windows 7 PCs to be authenticated by the server - basically a mock-up of what most large or medium-sized businesses/organisations run. Sorry if I sound a bit sketchy; I understand some things but not the whole picture. I guess I just need confirmation that I am sort of on the right track :) Also, what are some good resources out there you can point me to? What are the absolute basics I need to get started so I can further my learning?
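
    In sketch form, the usual next steps are to promote the server to a domain controller and then join each PC to the new domain (the domain name and account below are placeholders):

        rem on the server - launches the Active Directory Domain Services wizard:
        dcpromo

        rem on each Windows 7 PC, either use System Properties > Computer Name > Change,
        rem or from an elevated prompt (netdom requires RSAT on Windows 7):
        netdom join %COMPUTERNAME% /domain:home.local /userd:HOME\Administrator /passwordd:*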

    Read the article

  • What is excessive swapping?

    - by amateur barista
    This post led me to ask that question. Cache contention: On a large site, if you are using MyISAM, contention occurs in the database tables when the cache is forced to clear after a node or a comment is added. With tens of thousands of filter text snippets needing to be deleted, the table will be locked for a long period, and any accesses to it will be queued pending the purge of the data in it. The same is true for the page cache as well. This often causes a "site hang" for a minute or two. During that time new requests keep piling up, and if you do not have the MaxClients parameter in Apache set up correctly, the system can go into thrashing because of excessive swapping.
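
    The sizing rule behind that MaxClients advice is to keep every Apache child resident in RAM. A hedged example with invented numbers - roughly 2 GB available to Apache at ~50 MB per prefork child gives about 40 workers:

        # 2048 MB / 50 MB per child ≈ 40 concurrent workers
        <IfModule mpm_prefork_module>
            MaxClients           40
            MaxRequestsPerChild  1000
        </IfModule>

    Set it higher than RAM allows and a traffic spike pushes children into swap, which is exactly the excessive-swapping failure the quoted post describes.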

    Read the article

  • Syncing 1TB+ to an iSCSI device, software needed

    - by mojah
    Hi, I need to sync a local disk to an iSCSI mount on Windows (Server 2003), and I'm struggling to find software that's capable of doing so in a reasonable timeframe. Notes on the current 1TB disk:

        - 800GB currently in use
        - Contains a folder with several hundred thousand subfolders, which in turn have several thousand files ...

    So I'm trying to find a piece of software that can handle file lists this large and give me a good estimate of how long the copy will take. I've tried DeltaCopy (the rsync GUI client for Windows), but it's intolerably slow and doesn't provide a good estimate of the time remaining. DeltaCopy: http://www.aboutmyip.com/AboutMyXApp/DeltaCopy.jsp Does anyone know alternative software for Windows that would do this well?
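
    Plain robocopy may be worth trying before anything more exotic; a sketch with the drive letters assumed, which skips unchanged files on re-runs and reports progress to a log:

        robocopy D:\data E:\data /E /Z /ETA /NP /R:1 /W:1 /LOG:C:\robocopy.log

    /ETA prints per-file time estimates and /R:1 /W:1 keeps it from stalling on locked files. The multithreaded /MT switch would help on a tree this wide, but it only exists in newer robocopy builds (Windows 7 / Server 2008 R2 and later), not the Server 2003 resource kit version.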

    Read the article

  • Folder sync application which can sync over the Internet (the other machine specified by an IP)?

    - by Adal
    I need to sync some folders between two Windows 7 machines. While they are connected to the same LAN, they can't see each other over Windows networking, since sharing is disabled on both of them (for security reasons). Do you know of any sync app which can work over IP? The folder I need to sync has 500,000 files in it (80 GB in total), so the sync app should be pretty efficient. At the moment I copy the files from one machine to the other over FTP, but it takes forever, since a separate connection is opened for each file. Or maybe you know of some app which can efficiently transfer a large number of files between two machines over the Internet?
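
    Since the per-file FTP connections are the bottleneck, anything that streams every file over one connection should help. One hedged option is rsync in daemon mode via cwRsync on Windows, with the address and module name invented for the example:

        rsync -av --partial rsync://192.168.1.20/work/ /cygdrive/c/work/

    A single TCP session carries the whole 500,000-file listing, and only changed files move on later runs.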

    Read the article

  • Increase Volume of an MKV Video from Linux Terminal

    - by The How-To Geek
    I've got a large number of .MKV video files which all seem to play at a very low volume - I end up having to turn the TV up all the way to hear them, which is really irritating when I switch to another channel and wake the dead because it's so loud. What I'm looking for is a command-line method to increase the volume (so I can run it on all of them quickly) that would hopefully work regardless of the audio codec in use in the particular file (I don't mind hard-coding the output audio, though). For reference, I'm using Ubuntu 9.04 on my server, and the files are being played back with Boxee on a Mac Mini, but the volume problem is the same on Windows too.
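
    A hedged ffmpeg sketch that boosts the audio a fixed amount and leaves the video stream untouched (assumes a build with the volume filter; the 6 dB figure is a guess to tune by ear):

        for f in *.mkv; do
            ffmpeg -i "$f" -c:v copy -filter:a "volume=6dB" -c:a ac3 -b:a 192k "loud_$f"
        done

    Older ffmpeg builds, such as the one in Ubuntu 9.04, lack the volume filter and use the -vol option instead (256 = unchanged, 512 = double).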

    Read the article

  • Mac OS X 10.5 VNC Resolution independent of hardware display

    - by Qberticus
    Is it possible to set a VNC display resolution that is independent of the hardware resolution when you are using OS X 10.5 Screen Sharing? I have a MacBook and a Windows box with 3 monitors. I'd like to use the 3 monitors on my Windows box to do work on my MacBook when I'm at my desk. When I VNC into the MacBook I only get the resolution of the hardware screen (1280x800). Instead, I'd like to use two of the monitors on my Windows box to display a large VNC screen from my MacBook. The scaling options in the VNC clients (TightVNC and Ultr@VNC) do not adjust the actual resolution of the display; they just do image processing. My ultimate goal is some way to have a virtual display on my Windows box, driven by my MacBook, that is independent of the MacBook's hardware screen.

    Read the article

  • Best server sync software/methods [closed]

    - by Meep3D
    I have a test server at home and a test server at the office. I'd like to somehow sync multiple folders in both directions automatically, so I can work at home and also have an offsite backup. I've tried Live Sync (Microsoft's own product), but it chokes on large numbers of files and seems a bit rudimentary. Dropbox is also a bit small and does not adapt to our filesystem setup. I have seen a few online backup services, but none seemed geared to multiple computers using the same account. I don't mind paying a monthly fee provided the service is good. Suggestions would be gratefully appreciated!
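
    For self-hosted two-way sync between two servers, Unison over SSH is a common answer; a sketch with a hypothetical path and host:

        unison /srv/projects ssh://office.example.com//srv/projects -batch -auto

    -batch suppresses the interactive questions and -auto accepts all non-conflicting changes, so a cron entry running this periodically gives Dropbox-like behaviour plus the offsite copy.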

    Read the article

  • Central storage for Windows user accounts' home directories - hardware/software needed?

    - by mtkoan
    We have ~120+ users in our network, and we are endeavoring to centralize logon authentication and home-directory storage server-side. Most of the users are on Windows 2000/XP machines, and a few run Mac OS X. Ideally the solution will be open source - can this all be managed from a Linux server running LDAP and Samba? Or would a hacked NAS box running FreeNAS or similar suffice? Or is Micro$oft's Active Directory really the preference here? Is it viable to store PST files on this server for users to read from and write to? They are very large, ~1.5GB. We have no mail server (or money) capable of Exchange or IMAP, only an old POP3 server. What kind of hardware horsepower and network architecture should we have for this kind of thing?
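
    Samba can act as an NT4-style domain controller with LDAP behind it, which covers the Windows logons and the home directories; a minimal smb.conf sketch (all names are placeholders, and the LDAP backend line is optional):

        [global]
            workgroup      = EXAMPLE
            security       = user
            domain logons  = yes
            passdb backend = ldapsam:ldap://localhost

        [homes]
            browseable = no
            read only  = no

        [netlogon]
            path = /srv/samba/netlogon

    One caution from experience: Outlook is notoriously unhappy with PST files opened over a network share, so plan on generous bandwidth and solid backups if you go that route.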

    Read the article

  • How to secure a VM while allowing customer RDS (or equivalent) access to its desktop

    - by ChrisA
    We have a Windows client/(SQL-)server application which is normally installed at the customer's premises. We now need to provide a hosted solution, and browser-based isn't feasible in the short term. We're considering hosting the database ourselves, and also hosting the client in a VM. We can set all this up easily enough, so we need to: ensure that the customer can connect easily, and also ensure that we suitably restrict access to the VM (and its host, of course). We already access the host and guest machines across the internet via RDS, but we restrict access to only our own internal, very small set of static IPs, and of course there's the 2 (or 3?)-user limit on RDS connections to a remote server. So I'd greatly appreciate ideas on how to manage: the security, and the multi-user aspect. We're hoping to be able to do this initially without a large investment in virtualisation infrastructure - it would be one customer only to start with, with perhaps two remote users. Thanks!
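
    On the security side, one building block is a Windows Firewall rule scoping RDP to the customer's static IP; a sketch with an example address:

        netsh advfirewall firewall add rule name="RDP - customer X" dir=in action=allow protocol=TCP localport=3389 remoteip=203.0.113.10

    On the multi-user side, Remote Desktop for Administration tops out at two concurrent sessions; beyond that you are into Remote Desktop Services licensing (per-user or per-device CALs), which may still be cheaper than more virtualisation infrastructure.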

    Read the article

  • Backup/Multihomed network connection

    - by J_P
    We have a couple of locations that require 24/7 access to the Internet, and our current provider (AT&T), while mostly good, is not always up. My concern is that if I go with another provider (for example, Comcast), I'm going to be subject to the same downtime if the failure is in the "last mile". For the most part I don't know where the failure points are on the ISP side, but I would imagine the large majority are within the last mile. I'd looked at MiFi or similar solutions, but have concerns about bandwidth caps and overall speed. Any suggestions would be appreciated.

    Read the article

  • Can you span a single file share across multiple servers?

    - by Mike C.
    Let's say I have one large file server that hosts a company-wide share, and that file server constantly runs into space issues. Now let's say I have a dozen or so random servers (print servers, web servers, etc.) that have a ton of free space. Is there a way to utilize that space in the company-wide file share - spanning the storage across different servers while keeping it transparent to the user? I know the likely answer is going to be to get more storage for the file server, but I'm just curious whether this is conceptually possible. Thanks!
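
    This is conceptually what a DFS Namespace does: it presents folders that physically live on many servers under one share path. For a quick low-tech experiment, a directory symlink can also graft one server's spare space into another's share (hypothetical paths; clients must be allowed to follow remote-to-remote links):

        mklink /D D:\CompanyShare\Archive \\webserver01\spare
        fsutil behavior set SymlinkEvaluation R2R:1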

    Read the article
