Search Results

Search found 7216 results on 289 pages for 'low cost'.


  • Offsite AND incremental backup

    - by Pyrolistical
    I already do backups from my main computer to my server computer using SyncToy, but now I also want to do off-site backup. My idea so far: have the source hard drive (call it S) at home; have a backup hard drive at work, called B; have a transport hard drive, called T; connect T at work and record an index of the files on B; take T home, compare that index against S, note new/changed/deleted files, and copy the changed files to T; take T back to work and update B; repeat. It's basically a sneakernet, playing to its main strength: enormous bandwidth. Is there some software to do this, or do I have to write it myself?
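
    I am not aware of a ready-made tool for this exact workflow, so below is a minimal Python sketch of the core step, assuming an index (relative path, size, mtime) has been dumped to JSON from drive B at work: it copies files that are new or changed relative to that index onto the transport drive and records deletions. The drive paths, the index filename and the deleted.txt convention are all placeholders; a real version would also replay the copies and deletions onto B.

    ```python
    import json, os, shutil, sys

    def build_index(root):
        """Map relative path -> [size, mtime] for every file under root."""
        index = {}
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                full = os.path.join(dirpath, name)
                rel = os.path.relpath(full, root)
                st = os.stat(full)
                index[rel] = [st.st_size, int(st.st_mtime)]
        return index

    def copy_changes(source_root, transport_root, index_file):
        """Copy files that are new/changed vs. the recorded index onto the transport drive."""
        with open(index_file) as f:
            work_index = json.load(f)            # index recorded from drive B at work
        home_index = build_index(source_root)    # current state of drive S at home
        for rel, meta in home_index.items():
            if work_index.get(rel) != meta:
                dest = os.path.join(transport_root, rel)
                os.makedirs(os.path.dirname(dest), exist_ok=True)
                shutil.copy2(os.path.join(source_root, rel), dest)
        # files present at work but deleted at home
        deleted = sorted(set(work_index) - set(home_index))
        with open(os.path.join(transport_root, "deleted.txt"), "w") as f:
            f.write("\n".join(deleted))

    if __name__ == "__main__":
        # python sneakernet.py index <B_path> <index.json>           (run at work)
        # python sneakernet.py copy  <S_path> <T_path> <index.json>  (run at home)
        if sys.argv[1] == "index":
            with open(sys.argv[3], "w") as f:
                json.dump(build_index(sys.argv[2]), f)
        else:
            copy_changes(sys.argv[2], sys.argv[3], sys.argv[4])
    ```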

    Read the article

  • If I make a mail server, can I send bulk email?

    - by Jake Smith
    I work for a small company that has fallen into the fad of "email campaigns", a.k.a. junk mail. So far the company has gotten a subscriber list from our website and paid a good chunk of change for an emailer program. The problem is that our list has close to 4,000 people on it and growing, Gmail only lets about 100 emails per account through over SMTP, and I am on a tight budget, so I can't hire anyone else. I was thinking of running a dedicated mail server off of the web server we have in the office. Is it possible to generate the emails on your own server and send them through your own SMTP? If so, what software would I need, and is it free or at least low cost? We run a WAMP server; I set it up just for information, but I could switch it to LAMP or whatever if need be. Thank you for your time and your answers.
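
    For what it's worth: yes, a box running its own MTA (Postfix, Exim, hMailServer, etc.) can deliver mail directly, although deliverability then depends on reverse DNS, SPF and the reputation of your IP. As a rough, hedged illustration of the sending side only, here is a Python sketch that feeds a subscriber list to an SMTP server in small batches; the host, sender address, file name, batch size and pause are all assumptions.

    ```python
    import smtplib, time
    from email.message import EmailMessage

    SMTP_HOST = "localhost"         # assumed: your own MTA listening locally
    FROM_ADDR = "news@example.com"  # placeholder sender address
    BATCH_SIZE = 50                 # send in small batches to be polite
    PAUSE_SECONDS = 10

    def send_batch(recipients, subject, body):
        with smtplib.SMTP(SMTP_HOST) as server:
            for rcpt in recipients:
                msg = EmailMessage()
                msg["From"] = FROM_ADDR
                msg["To"] = rcpt
                msg["Subject"] = subject
                msg.set_content(body)
                server.send_message(msg)

    def send_all(list_file, subject, body):
        with open(list_file) as f:
            addresses = [line.strip() for line in f if line.strip()]
        for i in range(0, len(addresses), BATCH_SIZE):
            send_batch(addresses[i:i + BATCH_SIZE], subject, body)
            time.sleep(PAUSE_SECONDS)  # throttle so the queue (and recipients) keep up

    if __name__ == "__main__":
        send_all("subscribers.txt", "Monthly update", "Hello from the sketch script.")
    ```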

    Read the article

  • How to have a soft-real-time process in the presence of a heavily swapping, IO-intensive background load?

    - by Vi
    The player is already prioritized: schedtool reports "PID 32301: PRIO 4, POLICY R: SCHED_RR, NICE -20, AFFINITY 0xf" and ionice reports "realtime: prio 4". But the music stutters anyway. The background load is low priority (SCHED_IDLEPRIO, idle ionice class), but it uses a lot of memory (more than is physically available) and does a lot of IO and computation. Latencytop shows around 1500 ms for "Following symlink", "Writing buffer to disk (sync)", "Page fault" and "Writing a page to disk", both for the background load and for unrelated processes. The load average is 10 and counting. Why can't the system reserve, say, 200 MHz of one core, 32 MB of memory and at least one IO opportunity per second for mplayer to keep it happy, while the calculations continue in the background? Or: why can't it leave the background task and the swap to thrash each other while keeping the rest of the system as responsive as if there were no background load? How can I have RT processes AND a heavy background load at the same time (without resorting to virtual machines)?
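
    On the "leave the background task and the swap to thrash each other" idea: one approach is to cap the background job's memory with a cgroup so that it only thrashes against its own limit. A rough Python sketch, assuming root privileges and a cgroup v1 memory controller mounted at /sys/fs/cgroup/memory; the group name and the 512 MB limit are placeholders, and capping swap as well needs memory.memsw.limit_in_bytes where the kernel provides it.

    ```python
    import os, subprocess, sys

    CGROUP_ROOT = "/sys/fs/cgroup/memory"        # assumed cgroup v1 mount point
    GROUP_DIR = os.path.join(CGROUP_ROOT, "bgload")
    LIMIT_BYTES = 512 * 1024 * 1024              # placeholder cap for the background job

    def enter_cgroup():
        # runs in the child between fork() and exec(), so the limit applies from the start
        with open(os.path.join(GROUP_DIR, "tasks"), "w") as f:
            f.write(str(os.getpid()))

    def run_limited(cmd):
        os.makedirs(GROUP_DIR, exist_ok=True)
        with open(os.path.join(GROUP_DIR, "memory.limit_in_bytes"), "w") as f:
            f.write(str(LIMIT_BYTES))
        return subprocess.call(cmd, preexec_fn=enter_cgroup)

    if __name__ == "__main__":
        # usage: sudo python bg_limit.py <command> [args...]
        sys.exit(run_limited(sys.argv[1:]))
    ```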

    Read the article

  • Run disk error check on NTFS file?

    - by paulius_l
    I have a feeling that my system hard drive is dying, and a benchmark seems to confirm it: the system drive benchmarks far worse than the backup drive, even during low system activity. Furthermore, there are some files I just can't touch: I get CRC errors, and hard drive activity spikes to 100% with transfer speeds below 1 MB/s while working with them. I haven't yet tried swapping the SATA cable, although I have read that it can cause such problems. Anyway, I would like to run some tests on the specific clusters where the files I am interested in are stored. I don't want to run a full chkdsk because it takes a very long time. I would like either a utility that runs the disk check directly on the clusters a given file occupies, or a pair of utilities where one tells me the cluster locations and the other checks just those locations. How do I check, and possibly fix, disk errors only where the files I am interested in are stored? Edit: S.M.A.R.T. info is attached.
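
    I do not know of a stock Windows tool that checks only the clusters backing a single file (chkdsk works per volume), but you can at least map the unreadable regions yourself by reading the file sequentially and noting where the reads fail. A minimal Python sketch; the 64 KiB chunk size is arbitrary, and the reported offsets are relative to the file, not absolute disk clusters.

    ```python
    import sys

    CHUNK = 64 * 1024  # read in 64 KiB pieces

    def scan(path):
        """Read the whole file and report offsets where the OS returns an I/O error."""
        bad = []
        with open(path, "rb", buffering=0) as f:
            offset = 0
            while True:
                try:
                    data = f.read(CHUNK)
                except OSError as err:
                    bad.append((offset, err))
                    offset += CHUNK
                    f.seek(offset)       # skip past the unreadable region and keep going
                    continue
                if not data:
                    break
                offset += len(data)
        return bad

    if __name__ == "__main__":
        for off, err in scan(sys.argv[1]):
            print("read error near byte offset %d: %s" % (off, err))
    ```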

    Read the article

  • Increase the compression performance of a VPN

    - by Martin
    I am currently switching from a setup of HPN-SSH tunnels with compression enabled to something VPN based. I have tried tinc and n2n so far; hamachi requires a library I do not have. In my primitive benchmarks I am not satisfied with the achievable bandwidth compared to the SSH tunnels. In tinc the low LZO setting performed best, but compression is only available in UDP mode. Ideally I would like a TCP-based VPN with multi-threaded compression. Can you suggest some ways to increase the performance? Would it be possible to somehow put a compression filter in front of the tun interface? Or are there VPN implementations better suited to my needs (fast compression, TCP-based, switch mode, does not have to be super-secure)? I would consider tunnelling Ethernet over SSH, but according to some articles it is not advisable.
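
    To find out whether compression in front of the tunnel would even pay off, you can splice a tiny userspace filter into a pipe (for example around an SSH or netcat link) and measure the throughput. This is only a conceptual Python sketch of such a filter, a stdin-to-stdout stream (de)compressor; it is not a VPN, and the compression level and chunk size are assumptions.

    ```python
    import sys, zlib

    CHUNK = 64 * 1024

    def compress_stream(level=1):
        # a low compression level keeps the CPU cost modest, roughly in LZO territory
        codec = zlib.compressobj(level)
        src, dst = sys.stdin.buffer, sys.stdout.buffer
        while True:
            chunk = src.read(CHUNK)
            if not chunk:
                break
            # Z_SYNC_FLUSH emits everything buffered so far, keeping added latency low
            dst.write(codec.compress(chunk) + codec.flush(zlib.Z_SYNC_FLUSH))
            dst.flush()
        dst.write(codec.flush(zlib.Z_FINISH))
        dst.flush()

    def decompress_stream():
        codec = zlib.decompressobj()
        src, dst = sys.stdin.buffer, sys.stdout.buffer
        while True:
            chunk = src.read(CHUNK)
            if not chunk:
                break
            dst.write(codec.decompress(chunk))
            dst.flush()
        dst.write(codec.flush())
        dst.flush()

    if __name__ == "__main__":
        # pipe compressed data one way with no flag, decompress on the far side with -d
        decompress_stream() if "-d" in sys.argv[1:] else compress_stream()
    ```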

    Read the article

  • How to make Vim settings computer-dependent in .vimrc?

    - by Paperflyer
    I share my Vim configuration file between several computers, but I want some settings to be specific to certain machines. For example, font sizes on the high-res laptop should differ from those on the low-res desktop. More importantly, I want gVim on Windows to behave more Windows-y, MacVim on OS X to behave more Mac-ish, and gVim on Linux to just behave like it always does. (That might be a strange sentiment, but I am very used to switching mental modes when switching OSes.) Is there a way to make a few settings in the .vimrc machine- or OS-dependent?
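
    Yes: .vimrc is just Vim script, so you can branch on has() feature tests and on hostname(). A small sketch (in Vim script, since that is what the file is written in); the hostnames and fonts are placeholders. The ~/.vimrc.local trick at the end keeps truly machine-specific bits out of the shared file entirely.

    ```vim
    " OS / GUI specific settings
    if has('gui_macvim')
      set guifont=Menlo:h13          " placeholder font
      set macmeta
    elseif has('gui_win32')
      set guifont=Consolas:h10       " placeholder font
      behave mswin
    endif

    " Per-machine settings, keyed on the hostname
    let s:host = hostname()
    if s:host ==# 'laptop'           " placeholder hostname
      set guifont=Menlo:h14
    elseif s:host ==# 'desktop'
      set guifont=Menlo:h10
    endif

    " Or simply source a local file if one exists on this machine
    if filereadable(expand('~/.vimrc.local'))
      source ~/.vimrc.local
    endif
    ```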

    Read the article

  • Anyone have any experience with bargain laptop batteries?

    - by chris
    I've got an oldish D820 with a 100% dead battery. I know that I could, in theory, take the battery apart and replace the bad cells, but I'm not really comfortable doing that. I also know there are various places that sell replacement batteries for 20% to 80% of what Dell would charge. Does anyone have experience with buying more than a couple of these off-brand batteries? If a battery goes boom it could be really ugly, so I'd rather not risk it, but at the same time the Dell batteries are really expensive. Any opinions on these eBay / off-brand battery vendors? Thanks!

    Read the article

  • Tools to automate recording streaming radio

    - by Stan
    Are there any tools that can automate recording online streaming radio? I've been using Total Recorder, which has some useful features: a handy scheduler, and support for recording templates so I can set up high- or low-quality recordings. Unfortunately it requires opening the streaming radio in a browser and can't tolerate another sound source at the same time, since it records whatever comes out of the speakers. What I am looking for is a tool that, given an online radio URL, records the audio stream regardless of whether I am playing any other music. Does such a tool exist?
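
    If the station exposes a direct stream URL (MP3/AAC over HTTP), there is no need to capture the sound card at all; you can simply save the bytes while anything else plays. A minimal Python sketch that records a stream for a fixed duration and could be launched by the OS scheduler; the URL, output name and duration are placeholders, and ICY metadata is not handled.

    ```python
    import sys, time
    from urllib.request import urlopen

    def record(stream_url, out_path, seconds):
        """Save the raw audio stream to a file for a fixed duration."""
        deadline = time.time() + seconds
        with urlopen(stream_url) as stream, open(out_path, "wb") as out:
            while time.time() < deadline:
                chunk = stream.read(16 * 1024)
                if not chunk:          # stream ended early
                    break
                out.write(chunk)

    if __name__ == "__main__":
        # usage: python record_stream.py http://example.com/stream.mp3 show.mp3 3600
        record(sys.argv[1], sys.argv[2], int(sys.argv[3]))
    ```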

    Read the article

  • Ethernet 802.1x client -> WiFi AP on a Raspberry Pi?

    - by Martin Janiczek
    I have an Ethernet connection that requires 802.1x authentication (TTLS, MSCHAPv2, name + password). My goal is to connect that to something that can then act as a WiFi AP, so I can use the connection on more devices (iPhone, notebook, etc.). Would it be possible, and a good idea, to use a Raspberry Pi for this purpose? Or are there better-suited devices? EDIT: I found some alternatives, but because of low rep I can't post more than two links: an OpenWRT + wpa_supplicant guide; Carambola, which works with OpenWRT (but probably not standalone?); Hornet-UB, which works with OpenWRT; and an Asus RT-N10+ + OpenWRT how-to.

    Read the article

  • Rolling desktop recorder?

    - by lance
    I want a piece of software that constantly records what I'm doing on my desktop, discarding footage that's over [30] seconds old, so its recording is a rolling one. The idea is that I can hit a button at any time and see "what just happened". I don't want to have to babysit it; that is, I don't want software designed for screencasting (which I'm not trying to do). My bias against that is based on my (maybe incorrect?) assumption that I'd regularly have to start and stop the recording throughout the day. I'd also expect this kind of tool to consume fewer resources than a screencast recorder, since it only keeps a very limited amount of footage in memory (and low quality would even be acceptable) and discards frames fairly quickly after they're captured. Where can I find a piece of software with features like this?
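
    A rolling recorder is essentially a ring buffer of recent frames that is only written out on demand. A rough Python sketch of that idea, assuming the Pillow package (its ImageGrab works on Windows/macOS); the frame rate, window length, downscaling and the dump trigger are all placeholders, and a real tool would encode to video and hook a global hotkey.

    ```python
    import time
    from collections import deque
    from PIL import ImageGrab   # assumed: Pillow installed (Windows/macOS screen grabs)

    FPS = 2                     # a low frame rate keeps CPU and memory cost down
    WINDOW_SECONDS = 30
    frames = deque(maxlen=FPS * WINDOW_SECONDS)   # older frames fall off automatically

    def capture_loop(should_dump):
        """Keep only the last WINDOW_SECONDS of screenshots; dump them when asked."""
        while True:
            img = ImageGrab.grab()
            # half resolution: "low quality would even be acceptable", and it saves memory
            frames.append(img.resize((img.width // 2, img.height // 2)))
            if should_dump():                     # e.g. wired to a hotkey in a real tool
                for i, frame in enumerate(list(frames)):
                    frame.save("frame_%03d.png" % i)
            time.sleep(1.0 / FPS)

    if __name__ == "__main__":
        capture_loop(lambda: False)               # placeholder trigger that never fires
    ```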

    Read the article

  • SSRS Errors "Use Local", even though I am

    - by Corey Coogan
    I am at a loss. I posted this on SO, but I think this is probably a better place. I have searched high and low and don't know what to do. I am running SQL Server Web Edition on Server 2008, which only supports local databases. I am trying to connect to localhost, but when I test my connection I get this error: The feature: "The edition of Reporting Services that you are using requires that you use local SQL Server relational databases for report data sources and the report server database." is not supported in this edition of Reporting Services. The DB was upgraded from SQL Express, and when I run SELECT @@VERSION it says it is Web Edition. I've tried rebooting, and that seemed to fix it, but only for a little while.

    Read the article

  • Apache conf for high traffic CMS with backend users?

    - by Annan
    I'm in a situation where a website is going to have a high number of web users and a few backend webmasters. The webmasters upload images (and do other memory-heavy tasks), which bumps the memory allocation of the httpd child processes up to 100-150 MB. To stop swapping, I'm currently setting MaxClients in httpd.conf to 20, but this lowers the maximum number of simultaneous requests. Will this be a problem when the website goes live? What is the best configuration? Info: Drupal 6, PHP 5, Apache 2.2 (prefork at the moment). I'm thinking about the worker MPM, two Apache instances, or a low MaxRequestsPerChild.
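
    For what it's worth, the usual pattern is to size MaxClients so that (MaxClients x worst-case child size) fits in RAM, recycle bloated children with MaxRequestsPerChild, and, if that still gives too few slots, move PHP out of the Apache children (mod_fcgid/FastCGI) so a threaded MPM can handle the cheap requests. A hedged prefork example for a box whose children can reach ~150 MB; every number below is an assumption to be tuned against your own RAM:

    ```apache
    # httpd.conf (prefork MPM) - numbers are illustrative, size them to fit in RAM
    <IfModule mpm_prefork_module>
        StartServers            5
        MinSpareServers         5
        MaxSpareServers        10
        ServerLimit            30
        MaxClients             30      # ~30 x 150 MB worst case must fit in memory
        MaxRequestsPerChild   500      # recycle children before upload bloat accumulates
    </IfModule>
    KeepAlive           On
    KeepAliveTimeout    2              # short keepalive so busy children free up quickly
    ```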

    Read the article

  • What is the computer "doing" when it is running slow and task manager is not showing any CPU activity?

    - by Joakim Tall
    A typical example is shutting down a memory-intensive application: it can take quite a while before the computer gets back up to speed. Is there some inherent cost in releasing memory? Or is it throttled by some kind of hard drive activity, and if so, is there any good way to track that? I usually bring up Task Manager when a computer is running slow, and sorting by CPU activity usually shows which process is causing the problem, but sometimes no activity shows up at all. And yes, I do "show processes from all users". I have been wondering this since the days of Win2k :)

    Read the article

  • Ubuntu/Linux version recommendation for HP dv6 3122TX?

    - by sanjayav
    I purchased an HP dv6 3122TX recently, and after installing Ubuntu 10.10 64-bit I ran into multiple issues:
    - The wireless card is not supported by Ubuntu (the driver is RaLink RT3090).
    - The Ethernet sometimes stopped working for no reason (the driver is Realtek RTL8111/8168B).
    - A "Corrupted low memory at ..." error, which is described as a kernel bug on the Ubuntu support forums (it started dropping me to a terminal instead of the GUI, and the X server couldn't start after that).
    As I'm not an expert Ubuntu user I got fed up with all these issues and went back to Windows 7, but I need an Ubuntu installation up and running for my development work. What would you suggest as a reasonable Ubuntu version to try, or a different Linux distribution? Should I stick to a 32-bit version? It would be great if anyone could give some advice on this.

    Read the article

  • ASA Slow IPSec Performance

    - by Brent
    I have an IPSec link between two sites over ASA 5520s running 8.4(3), and I am getting extremely poor performance when traffic passes over the VPN. CPU on the device is 13%, memory is at 408 MB, and there are 2 active VPN sessions, so the load on the device is quite low. A Wireshark capture of a file transfer between the two hosts over the VPN shows a large number of header checksum failures, which is alarming, but I am not sure what to check next. iperf shows around 4-5 Mbit/s with differing TCP window sizes. show run on the ASA: http://pastebin.com/uKM4Jh76 ; show cry accelerator stats: http://pastebin.com/xQahnqK3

    Read the article

  • Automating the insertion of credits in video files on Mac OS X

    - by Roberto Aloi
    I have a bunch of video files (MP4). I need to insert some title information at the beginning and some credits at the end. I'm currently using iMovie. Since the title can be extracted from the filename and the credits are always the same, I'm wondering how I could make this process as automatic as possible. I was thinking about Automator, but I'm open to any other solution. iMovie is the preferred tool so far, but I could use anything else (as long as it doesn't require additional licensing or cost). Any ideas?
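
    One scriptable alternative to iMovie/Automator is ffmpeg's concat demuxer: render the credits clip once and a title clip per file, then join them without re-encoding. A rough Python sketch, assuming ffmpeg is installed and that title.mp4 and credits.mp4 already exist with the same codec, resolution and frame rate as the sources (otherwise "-c copy" will fail and a re-encode is needed); all file names are placeholders.

    ```python
    import os, subprocess, sys, tempfile

    def add_credits(video_path, title_clip, credits_clip, out_path):
        """Concatenate title + video + credits without re-encoding (codecs must match)."""
        entries = [title_clip, video_path, credits_clip]
        with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
            for clip in entries:
                f.write("file '%s'\n" % os.path.abspath(clip))
            list_file = f.name
        subprocess.check_call([
            "ffmpeg", "-f", "concat", "-safe", "0",
            "-i", list_file, "-c", "copy", out_path,
        ])
        os.unlink(list_file)

    if __name__ == "__main__":
        for src in sys.argv[1:]:
            base = os.path.splitext(os.path.basename(src))[0]
            # the title clip would be rendered per file from `base` by another step
            add_credits(src, "title.mp4", "credits.mp4", base + "_credited.mp4")
    ```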

    Read the article

  • Microsoft CALs for Domain Controller

    - by Damo
    I am designing a network and have come to the point of specifying the number of CALs it requires. Microsoft licensing has always confused me; it's just not always clear. I plan to have one Windows Server 2008 Standard domain controller, another 2008 server (not a domain controller), and 200 Windows 7 devices joined to the domain for domain services. The 200 Windows 7 devices will all authenticate to the domain controller with the same domain account (this is a special type of network, not a user workstation network). Do I therefore need to purchase 200 device CALs for the 200 devices, or can I purchase, say, 10 user CALs, since the number of unique user accounts is very low? Many thanks for looking.

    Read the article

  • Startup Cassandra layout

    - by davidkomer
    We've got a relatively low-traffic site (~1K pageviews/day) hosted on a single server, and we expect it to grow significantly over the next few years. I'm thinking of moving over to Rackspace Cloud Servers or EC2 and firing up three nodes (all on CentOS): two web nodes (Apache) behind a load balancer, and one MySQL node (for the WordPress-powered part). The question is where to put Cassandra right now: should it sit on the web nodes, or on the MySQL node? My current thought is to put it on the web nodes. It's my understanding that Cassandra is fault-tolerant (i.e. if we take a node down, the site stays operational), so even with only two nodes we'd get that benefit, as opposed to putting it on the single MySQL node. Also, as we scale up and add another web node, a Cassandra instance can come along with it, and the PHP can always run its queries against localhost. Is this a good idea?

    Read the article

  • Tracking costs within one AWS account

    - by caius howcroft
    I have what I'm sure is a very common problem. Our company has many projects and groups working for different clients. We do a lot of our development work in the cloud and deploy our solutions there. We have a VPC set up that isolates projects from each other in their own subnets, and that VPC has a hardware VPN connection back to HQ. We need to keep track of the cost run up by every project. The way I currently implement this is by providing my own tools for starting and stopping instances, which log which user (and thus which project) to bill each instance to. This works okay for BoxUsage costs but not for other costs. I could create a separate account for each project and use consolidated billing, which I think would let me pay once but track costs per "project", but then I would not be able to share common resources (like bringing account B's running instances inside the same VPC). Does anyone have any suggestions? Cheers, C
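
    Since your start/stop tools already know the project, one lighter-weight option inside a single account is to tag every instance with the project at launch and then break the bill down by that tag (AWS cost allocation tags work on this principle). A small sketch using boto3 to show the tagging side; the tag key, project name, AMI id and instance type are placeholders, and credentials are assumed to be configured.

    ```python
    import boto3  # assumed available and configured with credentials

    ec2 = boto3.client("ec2")
    PROJECT_TAG_KEY = "Project"   # enable this key as a cost allocation tag in billing

    def launch_for_project(project, ami_id, instance_type="t2.micro"):
        """Start an instance and tag it so its cost can be attributed to a project."""
        result = ec2.run_instances(ImageId=ami_id, InstanceType=instance_type,
                                   MinCount=1, MaxCount=1)
        instance_id = result["Instances"][0]["InstanceId"]
        ec2.create_tags(Resources=[instance_id],
                        Tags=[{"Key": PROJECT_TAG_KEY, "Value": project}])
        return instance_id

    if __name__ == "__main__":
        # placeholder AMI id; everything tagged Project=client-a can then be broken
        # out in the billing reports once the tag is activated for cost allocation
        print(launch_for_project("client-a", "ami-12345678"))
    ```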

    Read the article

  • Text template or tool for documentation of computer configurations

    - by mjustin
    I regularly write and update technical documentation that is used to set up new virtual machines or to look up system dependencies in networks of around 20-50 (server-side) computers. At the moment I use OpenOffice Writer with text tables and create one document per intranet domain. To improve this documentation, I would like to collect some examples to identify areas where my documents can be improved, regarding general structure and content, so they are easy to read and use not only for me but also for technical staff, helpdesk, etc. Are there simple text templates (for example for OpenOffice Writer) or tools (maybe database-driven) for structured documentation of a computer configuration? Such a template or tool should provide required and optional configuration sections, like 'operating system', 'installed services', 'mapped network drives', 'scheduled tasks', 'remote servers', 'logon user account', 'firewall settings', 'hard disk size', and so on. These documents are not so much low-level hardware docs as infrastructure/integration information (no BIOS settings, no MAC addresses).

    Read the article

  • SQL Server availability issue: large query stops other connections from connecting

    - by Carlos
    I've got a high-spec (multicore, RAID) server running MS SQL 2008 with several databases on it. I have a low-throughput process that periodically needs a small amount of information from one of the DBs, and the code seems to work fine. However, sometimes when one of my colleagues runs a huge query against one of the other DBs, I see full CPU usage on the machine and connections from my app time out. Why does this happen? I would have thought that the many cores and hard disks, together with a cleverly written DB server, would be able to keep at least some resources free for other apps. I'm pretty sure he doesn't use multiple connections for his query. What can I do to prevent this?

    Read the article

  • Why doesn't Ghost 2003 offer to fill the destination drive?

    - by Neil
    Because it is dangerously low on disk space, I want to upgrade an SBS 2003 server by replacing its existing 72 GB drive with a 364 GB drive. When I tried to use Norton Ghost 2003 to clone the disk, it didn't suggest that I use the entire new drive. I'm worried that I caused the process to fail by overriding its decision: although the cloned drive boots in Safe Mode, if I try booting it normally then none of the SQL Express instances start, and something causes the server to reboot before even the Ctrl+Alt+Del screen appears. Does Ghost 2003 know something that I don't? Or should I be using some other software?

    Read the article

  • Actual High Speed USB flash drive

    - by CSkau
    I'm looking to upgrade my EEE 1000H, possibly by replacing the HDD with simple (internal) USB-connected storage. The problem I'm having is that I can't seem to find any genuinely fast USB sticks. They all proclaim high speeds, but usually turn out to deliver around 30 MB/s, much lower than the 60 MB/s (480 Mbit/s / 8) I understand USB 2.0 tops out at, no? Can anyone explain why no USB sticks seem to go past that low bar, or alternatively point me to some actually fast USB sticks? Any help is greatly appreciated :) Cheers!

    Read the article

  • In Varnish, is it normal for the number of freed bytes to be 60% of those allocated?

    - by user331397
    I have an installation of Varnish 3.02 on an Amazon EC2 Medium Linux instance in front of two relatively low-traffic websites. After an uptime of 2 hours, there are 3400 objects in the cache. Using varnishstat, I checked the variables SMA.s0.c_bytes and SMA.s0.c_freed, which I assume correspond to the total number of bytes allocated since startup and the number freed, respectively. No objects should have had time to expire during these two hours, but still about 60% of the memory allocated since startup (330MB out of 560MB) has already been freed. Do you know if this is normal? If not, do you know what kind of configuration could be wrong?

    Read the article

  • I would like to have a publicly accessible Linux box hosted elsewhere. Who provides this service?

    - by Eric Wilson
    I would like to have a general-purpose Linux server available and publicly accessible. I understand that there is no lack of web-hosting companies, but I might want more control over the machine than is typical. I would want the ability to install software, such as an SVN server, and I would like to be able to expose various port numbers, as I may have a variety of extremely low-traffic sites that I want to make available. Obviously, one option is to host such a machine in my home. Is that my only option, or is what I describe available out there, perhaps as a virtual machine on a larger server?

    Read the article
