Search Results

Search found 4100 results on 164 pages for 'recommend'.

Page 126/164

  • Home Server: CPU virtualisation, what to choose?

    - by Huygens
    I'm looking for virtualisation solutions for storage and OS for a home server: a sort of private cloud where I manage the storage space independently of the VMs. This question focuses on VM (or compute instance) management and what would best suit my needs (I have another question related to the storage management). My use cases are:
      - A backup server: rsync and other services running.
      - A personal cloud server: a kind of self-owned Dropbox system, à la ownCloud, with a few users foreseen.
      - A media server: streaming videos and displaying photos.
    Here are my environment and wishes:
      - Server: HP ProLiant MicroServer with 8 GB RAM (AMD Turion dual core with AMD-V technology).
      - OS types: only Linux (perhaps a *BSD VM in the future). The Linux distribution does not matter; I'm familiar with RHEL, Fedora, SUSE and Ubuntu, but any other recommendation is fine.
      - 2-3 VMs foreseen: backup server, ownCloud server and media server (optional). These are servers only, so no graphical console is needed (I don't need VirtualBox).
      - By VM I mean a virtualised environment like KVM, Xen, etc., or a compute instance like with OpenStack.
      - Storage should be "virtualised/cloudified"; see my other question.
      - VMs should be able to migrate to another server in the future if the current server can no longer meet performance needs.
      - It does not matter if installing such a setup is complicated, as long as the management tools allow for easy maintenance.
      - I don't have Windows at home, so the solution should be Linux friendly; web based would be nice, but native apps are OK too.
      - The system should be easy to extend by adding a new server and migrating some of the VMs to it.
    So it's really a kind of private cloud on which I could run some Linux OSes. I would prefer free (libre, as in free speech) and open source tools, but it does not have to be free as in free beer. So Xen, KVM, VirtualBox or OpenStack? What would you recommend?
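    For reference, a minimal KVM sketch of what such a setup could look like on this hardware (a sketch only, assuming a Debian/Ubuntu-style host; package names, URLs, sizes and paths are illustrative):

        # Verify that the CPU exposes AMD-V (svm) or Intel VT-x (vmx); a non-zero count means KVM can use it
        egrep -c '(svm|vmx)' /proc/cpuinfo

        # Install the KVM/libvirt stack (Debian/Ubuntu package names assumed)
        sudo apt-get install qemu-kvm libvirt-bin virtinst

        # Create a headless VM for the backup server; --graphics none because no console is needed
        sudo virt-install \
          --name backup-server \
          --ram 2048 \
          --vcpus 1 \
          --disk path=/var/lib/libvirt/images/backup-server.img,size=20 \
          --location http://ftp.debian.org/debian/dists/stable/main/installer-amd64/ \
          --graphics none \
          --extra-args 'console=ttyS0'

        # libvirt also provides the later migration path (virsh migrate) and remote management over SSH
        virsh list --all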

    Read the article

  • Drobo-like Linux file server - how do I do it?

    - by John Hunt
    I've been pondering for a long time how I can set up a server which operates much like the Drobo storage thing. The reason I don't actually want a Drobo is that I've heard scare stories, plus I'd like to do this on the cheap. So ideally I'm looking for something like LVM, so I can create a logical volume that spans many hard disks of varying sizes... obviously that only offers redundancy if I put the LV on a RAID array (as far as I know). I have, however, been reading about technologies such as Microsoft's Drive Extender, which duplicates files at the filesystem level and makes sure that the mirrored files are on a different physical disk. Does anyone know of or recommend a filesystem or method like this, as it'll hopefully make much better use of the space available than RAID ever could? Performance isn't an issue; I'd just really like to make the most of the hard disks I have lying around whilst having a bit of redundancy in case a disk dies. I understand full well that this is no replacement for a backup, but I'll only be storing files of medium importance and using the NAS itself as a backup of my main PC and other systems. Thanks in advance! I'm hoping ZFS or btrfs or something can do something clever for me :)
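    As a rough sketch of the btrfs route (device names and the mount point are examples only; this assumes a kernel and btrfs-progs recent enough for multi-device support), btrfs can pool disks of different sizes and mirror data at the filesystem level:

        # Pool three differently sized disks into one filesystem,
        # mirroring both data (-d) and metadata (-m) across devices
        sudo mkfs.btrfs -d raid1 -m raid1 /dev/sdb /dev/sdc /dev/sdd
        sudo mkdir -p /srv/storage
        sudo mount /dev/sdb /srv/storage

        # Later, grow the pool by adding another disk and rebalancing the data onto it
        sudo btrfs device add /dev/sde /srv/storage
        sudo btrfs filesystem balance /srv/storage

        # Check how space is spread across the devices
        sudo btrfs filesystem show
        sudo btrfs filesystem df /srv/storage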

    Read the article

  • Nginx, HAproxy, Unicorn, Rails and Node settings

    - by Julien Genestoux
    Our application is currently only a "regular" web app, with no fancy things like streaming HTTP or websockets. It's mostly a Rails app, served by a few (20 on 2 machines) Unicorn workers, proxied by a venerable nginx server which deals with load balancing. This has been working quite well for the past year, and the app now serves between 400 and 800 requests per second at any point during the day. We're soon releasing 2 new APIs, both served by a Node application: a websocket one, as well as a long-polling HTTP one (the fancy kind, like the Twitter streaming API, where HTTP connections never end). They both use the same port on Node, and since the Node app is stateless, we can certainly deploy a few of them to handle the traffic. The Node app is now deployed as 5 instances, listening on 5 different 'private' ports on the same host. We need to put something in front of them to load balance, but also something that is able to deal with sockets (either websocket or HTTP streaming) which are intended to stay 'up' for days. The question then is: what? I read somewhere that HAProxy does a better job than nginx at this. What do you recommend?
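    As a sketch of the HAProxy side (a fragment only; the ports are placeholders, and `timeout tunnel` needs HAProxy 1.5 or newer, on 1.4 you would raise the client/server timeouts instead), the essentials are long idle timeouts for the socket backends and least-connection balancing across the Node instances:

        # Write an illustrative /etc/haproxy/haproxy.cfg, then syntax-check it
        cat <<'EOF' | sudo tee /etc/haproxy/haproxy.cfg
        defaults
            mode http
            timeout connect 5s
            timeout client  60s
            timeout server  60s

        frontend public
            bind *:8000
            default_backend node_streaming

        backend node_streaming
            balance leastconn          # long-lived connections, so spread by connection count
            timeout tunnel 1h          # keep upgraded websocket connections open (HAProxy 1.5+)
            server node1 127.0.0.1:9001 check
            server node2 127.0.0.1:9002 check
            server node3 127.0.0.1:9003 check
            server node4 127.0.0.1:9004 check
            server node5 127.0.0.1:9005 check
        EOF
        sudo haproxy -f /etc/haproxy/haproxy.cfg -c   # check the configuration without starting it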

    Read the article

  • Are these parts compatible?

    - by ell
    I have never assembled a PC before, although I have taken an old one apart and replaced a few parts in others here and there, so I have (very) limited experience. I have been looking to build a PC, and here are the parts I might buy:
      - Foxconn P45AL Intel P45 (Socket 775) DDR2 motherboard (with onboard sound, I believe)
      - Gigabyte GeForce GTX 460 OC 768MB GDDR5 PCI-Express graphics card
      - 2 x 1 GB sticks of dual-channel DDR2 memory (already have)
      - Intel Core 2 Quad Q8400 LGA775 'Yorkfield' 2.66GHz 4MB-cache processor
      - Samsung SpinPoint F3 1TB SATA-II 32MB cache hard drive
      - Antec Dark Fleet Series DF10 gaming enclosure – black
      - Akasa Freedom Power 1000W modular power supply
    I already have a monitor, mouse, keyboard and DVD/CD drive. I have never done this before, so feel free to laugh at me for getting something obvious wrong, forgetting a vital component, etc., but is all of this compatible? And have I gone overkill on the PSU? If so, please recommend one. Thanks in advance, ell. EDIT: Added the PSU, which I forgot to mention. EDIT: I would be using this to surf the internet, write e-mails, chat, word process, play games such as Team Fortress 2 and Spring RTS (at highest graphics, hopefully), do some 3D modelling in Blender, some OpenGL programming, and image editing in GIMP.

    Read the article

  • Looking for "bitmap-vector" image editor

    - by Borek
    I used to use PhotoImpact, which is no longer developed, so I'm looking for a replacement. What made PhotoImpact great for me was the ability to work in both bitmap and vector modes. What I mean by that: I could take an image or screenshot and easily add arrows, text captions or shapes to it. These shapes were vector objects, so I could come back to them later and amend their properties easily. Software I know of:
      - Paint.NET is purely bitmap, so please don't recommend it - layers are not enough for my needs.
      - The drawing tools in MS Office work pretty much the way I'd like - you can paste an image and then add vector objects on top of it. It just doesn't feel right to have the full-fidelity original images stored as .docx or .pptx (I don't fully trust Word/PowerPoint not to compress the image).
      - I'm not sure about GIMP, but if it's just a "better Paint.NET" (i.e., layers but no vector objects) I'm not interested.
      - Photoshop is out of the question, purely because of its price tag.
      - Corel killed PhotoImpact because they already had a competing product (Paint Shop Pro), but AFAIK it lacks vector features.
    Any tips for PhotoImpact alternatives would be very welcome.

    Read the article

  • How do I troubleshoot a problem syncing Google contacts to an iPad?

    - by Daryl Spitzer
    I use my MacBook Pro to sync content onto my wife's iPad (she doesn't have a computer). She doesn't want all my contacts from the Address Book app on my MBP, but she does want her Google contacts on her iPad. I've tried the following settings in iTunes: I created a group in my address book called "Claire's" (and put just a couple of contacts in it), since if one enables "Sync Address Book Contacts" one has to select either "All" or at least one group. I've double-checked her email address in the dialog that comes up after pressing the "Configure" button. But after syncing, only the couple of contacts in the "Claire's" group are in the Contacts app on her iPad. I've checked her Google contacts, and she has over 2000. For some reason they're not syncing. How do I find out why not? I looked to see if I could just use an app to do the sync on the iPad, but couldn't find one with good ratings. Do you have one to recommend, so I can give up struggling with getting this working in iTunes?

    Read the article

  • Apple Magic Trackpad 3-Finger Drop Lag

    - by activestylus
    After enabling three-finger dragging for my Trackpad, I notice that it drags well, but when I release there is about 1-2 seconds of lag before it actually drops. I understand this is supposed to be a feature, so that when you run out of space to drag you have time to move your hand. But for those of us power users who move really fast, this is a BUG, not a feature. There should be some way to turn it off! For some perspective, I personally own a FingerWorks trackpad as well (the company Apple bought to make the Trackpad), and it does not suffer from this problem: drops are instantaneous no matter what program I am in. This is hugely frustrating for me, because I thought I was upgrading here, and Apple's version does not perform as well as the FingerWorks model (which I purchased in 2004). I actually made a short video illustrating the problem and why it is so frustrating for anyone who uses the pad as an artistic tool. Anyone here face this problem? If not, how would you recommend I address Apple directly about this? PS - I already looked at this thread and the conclusion does not help me. I do not have one-finger drag enabled. PPS - I understand that for most people this is not an issue, because they use the 'click' feature of the Trackpad. However, after years of using FingerWorks and never having to click, I find that it slows me down.

    Read the article

  • Hosting a Django backend for an iPhone / Android app

    - by Ashok Fernandez
    I am looking to make an iPhone / Android app for my university using the Appcelerator Titanium framework. The app will rely heavily on a server backend which will pull information from other sites, figure out what is relevant to the user and then deliver the content. Some of the information is individual to the user (calendar data), other bits are updated frequently but are shared (bus timetables), and others are static and the same for everyone (magazine articles). I was going to use Django, as I am fairly proficient in Python, so I thought it would save time. My question is: which hosting services do you recommend for the server backend? I am expecting about 9000 people to use the app, with very random spikes in traffic, but unfortunately I have very little to go on at this stage. I have heard a lot about WebFaction; is it suitable for something like this, or am I likely to need something bigger? I don't really want to fork out for a VPS at this stage. What about Amazon's EC2? Would that be more suitable than WebFaction? Sorry for the fairly open-ended question; I'm sort of new to this, so I'm open to all suggestions.

    Read the article

  • Recommended open-source firmware for ASUS RT-N16

    - by MasterF
    I have recently acquired an ASUS RT-N16 router. My original plan was to install Tomato on it. However, after checking their website I found out that the firmware has not been updated in the last 2 years. There seem to be a few updated mods, but none of them really seemed mature/stable/well-documented. I would like to know what other people recommend as open-source firmware for this router. I know the answers will probably be subjective, so I will give a bit of background on my needs:
      - For now I will only use the Wi-Fi on an Android phone.
      - The connection will not be shared with anyone (so QoS is optional).
      - I want a stable (wired) connection on my PC (for online gaming etc.).
      - I want the (wired) download/upload speeds to be as close as possible to those achieved by plugging the Ethernet cable directly into the PC's network card; I have a 100 Mbps connection.
      - My ISP uses PPPoE.
      - My technical level: I am a software developer and I have good knowledge of bash scripting, but no experience with networking.
    Also, I know that I could probably just use the stock firmware (and maybe will use it for a while), but I'm interested in trying an open-source version (for more features, flexibility, as a learning exercise, etc.).

    Read the article

  • Pin the Dock to the top

    - by Chris Buchholz
    I wonder if it is possible to pin the Mac OS X Dock to the top of the screen in Snow Leopard? I see lots of ideas on how to do this when I google for it, and Secrets (the tweaking app) also provides it as an option, but none of those ways work for me. I guess it must have worked at some point, since people said it did, but I believe this feature might have been removed from Snow Leopard, and therefore does not work for me. Is this so? Is there really no way to pin the Dock to the top of the screen? If not, what ways of "getting rid of the Dock" can you recommend? I have tried auto-hiding, but my problem is that it leaves a 4 px line at the edge where the Dock is pinned that applications won't cover. That's not ideal for me. As far as I have understood from googling, this line does not appear if the Dock is pinned to the top, hence my question. What other ways do you use to get rid of it?

    Read the article

  • Backing up a default Windows installation with dd from Linux running on another partition - is this feasible?

    - by Marek
    I am preparing to reinstall my system. I am thinking about creating a multi-boot setup with a Linux distro + Windows 7 to choose from at startup. I would love to be able to skip all the hassle of reinstalling Windows and all programs when it starts becoming too slow in the future, so I would like to mirror my fresh Windows system partition with some programs preinstalled. I am thinking about installing Ubuntu, making a partition for Windows, installing Windows with the basic environment (Visual Studio, Office, etc.), then booting into Linux and making an image of the Windows partition with dd. I am not familiar with Linux at all, so I am a little afraid something may go wrong along the way. Is it possible to do it this way? Will I be able to partition my existing disk for multi-boot easily after I install Ubuntu? Will I be able to recover the Windows partition easily using dd when I need to re-create a fresh Windows partition in the future? What other (better) approach can you recommend to achieve the goal of easy disk mirroring (for free)?
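    For reference, a minimal sketch of what the dd image/restore step could look like from the Ubuntu side (device names and paths are examples only; the Windows partition must not be mounted while it is being imaged or restored):

        # Identify the Windows partition first (here /dev/sda2 is only an example)
        sudo fdisk -l

        # Image the partition, compressing on the fly to save space on the backup drive
        sudo dd if=/dev/sda2 bs=4M | gzip > /media/backup/win7-fresh.img.gz

        # Later, restore the image onto the same partition (it must be the same size or larger)
        gunzip -c /media/backup/win7-fresh.img.gz | sudo dd of=/dev/sda2 bs=4M

        # Refresh the boot menu so the restored Windows entry is picked up again
        sudo update-grub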

    Read the article

  • Best way to integrate applications into a Windows 7 install.wim image

    - by cyph3r
    Right now I have an unmodified .iso of a Windows 7 32-bit and 64-bit installation disk, and I need to integrate into it some applications (Office, Adobe Reader, etc.) and Windows updates, so that when Windows is installed the above applications/updates are already installed and working. Requirements:
      - My output has to be an install.wim image containing the new/improved Windows installation files, because deployment is done via a PXE server and a custom Windows PE environment.
      - The procedure to create the install.wim has to be as automatic as possible. I can't create it manually every time I want to incorporate a new Windows or application update into the image.
      - The image will be installed on 100+ computers, so it needs to be 'generic'.
    I've never done something like this before, but from what I've found a possible solution would be:
      1. Create a reference installation (preferably on a VM so I can take snapshots), complete with its applications/updates/settings. After the complete setup, take a snapshot of the installation.
      2. Run C:\Windows\System32\sysprep\sysprep.exe /oobe /generalize /shutdown to sysprep the machine.
      3. Boot into a Windows PE environment and capture the .wim image using GImageX.
      4. Deploy the .wim and enjoy the rapid installation times. :D
    Does that sound OK? Would you recommend anything else? Right now the applications are installed after the installation of Windows is complete, so the total installation time is quite long. That's why I need a different approach.

    Read the article

  • SUSE Linux and Xen on Mac Pro - How best to prepare and configure?

    - by Andrew J. Brehm
    This is a long-winded question, so please bear with me. I have a 2009 Mac Pro with two CPUs and 8 GB of memory, which is totally overpowered for Mac OS X. I am also in the process of slowly moving away from Mac OS X as my main platform. Since the Mac Pro is really new and nice, I have finally decided to use it for another platform. I am familiar with Linux and SUSE Linux. Ultimately I want to run some version of SUSE Linux (recommend one; it doesn't have to be free as in no money) and Xen. Here are the individual questions:
      - Which version of SUSE Linux should I use, and how do I install it on a Mac Pro? Note that the distribution must come with usable Xen. I am willing to pay.
      - I assume Xen will work on my computer (it has VT support etc.). Is my assumption correct?
      - I want to run Windows 7 and another instance of SUSE Linux under Xen. Is it possible to run Mac OS X Server under Xen (on a Mac Pro)?
      - Which email client under Linux supports IMAP and is best suited for integrating with MobileMe?
      - Does SUSE Linux support the ATI Radeon HD 4870 and the Apple Cinema Display's 1920 x 1200 resolution?
      - What else should I take into account?
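    On the "will Xen work" question, a quick sanity check can be done from any Linux live environment (a sketch; the xm commands assume the classic Xen toolstack that SUSE ships):

        # Hardware virtualisation flags: 'vmx' for Intel VT-x (the 2009 Mac Pro's Xeons), 'svm' for AMD-V
        egrep -o '(vmx|svm)' /proc/cpuinfo | sort | uniq -c

        # Once booted into a Xen-enabled kernel, confirm the hypervisor is active;
        # hvm-* entries in xen_caps mean HVM guests such as Windows 7 will run
        sudo xm info
        sudo xm list   # Domain-0 should be listed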

    Read the article

  • Which version control should I use for my configuration files?

    - by rakete
    I want to store some of my configuration files (~/.emacs.d/, .Xdefaults, and other Linux $HOME stuff) in version control, so I can easily sync them between my notebook and workplace, see my past changes, and revert to them should the need arise. So far it seems that quite a few people use git for this, and I think that I too want to use a distributed VCS (if only to get more used to them), but I can't say that I am very experienced with all things DVCS. I did use darcs and git briefly, and so far I can say that I really like the way git handles branches; I think the possibility of having different branches within the same directory is especially useful for my use case. Darcs, on the other hand, has cherry-picking of patches, which is also quite a convenient feature when managing configuration files (at least I assume it is). So, what would you recommend, and what is your reasoning? What other VCSs with nice features that I haven't mentioned would make a good choice for storing configuration files, and why?
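    For reference, one commonly used git-based sketch for dotfiles (the .cfg path and the 'config' alias name are just conventions, not requirements): keep a bare repository in $HOME and treat the home directory itself as the work tree, so no symlinks are needed:

        # One-time setup: a bare repo that holds only the tracked dotfiles
        git init --bare "$HOME/.cfg"
        alias config='git --git-dir=$HOME/.cfg/ --work-tree=$HOME'
        config config status.showUntrackedFiles no   # don't list every file in $HOME as untracked

        # Day-to-day use: track, commit and sync only the files you care about
        config add ~/.emacs.d/init.el ~/.Xdefaults
        config commit -m "Track emacs and X settings"
        config remote add origin ssh://yourserver/path/to/dotfiles.git
        config push -u origin master

        # On the notebook or workplace machine, clone into the same layout
        git clone --bare ssh://yourserver/path/to/dotfiles.git "$HOME/.cfg"
        config checkout   # may refuse if files already exist; move them aside first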

    Read the article

  • Is this way of using Excel 2007 pivot tables for BI scalable?

    - by Sim
    Hi all,
    Background:
      - We need to consolidate sales data from across the country for analysis.
      - Our Internet connection/IT expertise/IT investment is not that strong, so a full BI solution is out of the question.
      - I tried several SaaS BI solutions (GoodData, Zoho Reports) and while they're good, they don't seem to fully support what we need.
      - We're looking at about 2 million records every 2 months.
    My current approach:
      - Our 10 sites currently gather data from all their branches and consolidate it into 1 Excel file with a pivot table and embedded source data.
      - At HQ, I will ask the 10 sites to send back those Excel files periodically.
      - We will import those Excel files into our MSSQL server.
      - There will be a master Excel file that has the same pivot table (as the ones in the site Excel files), with the MSSQL server as its data source.
    More details: For testing, I currently use MSSQL 2008 Express on my laptop. So far, I have imported our transactions for the past 2 months, and there are 2 million+ rows in 1 table in MSSQL (we just use 1 table, corresponding to our common pivot table structure). The DB size is ~600 MB. The master Excel file, not including the source data, is just under 10 MB; including the source data increases the size to 60 MB (so I suppose Office 2007 automatically compresses the data?). I tried using the pivot table (drag-and-drop fields) and the performance so far is OK (my laptop specs: C2D T7200, 3 GB RAM, Windows XP). So my questions are:
      - If we're looking at a full year of transactions (roughly 15 million rows in MSSQL 2008 Express, 3.6 GB in size), is there any issue with 15 million rows in 1 table in SQL Express?
      - Is there any performance issue with the pivot table at that point? Can it still embed the source data? (I googled but didn't find the maximum size of source data Excel 2007 can embed.)
      - Any other suggestions on how we can do this better? Given that we can't afford a full BI solution, is there any lightweight/budget/SaaS BI that you can recommend?
    Thanks

    Read the article

  • Is there a good alternative to Videora iPod Converter?

    - by Richard
    I use Videora to convert my videos (in DivX/XviD format) to something I can play on my iPod Classic. I really dislike it. It's clunky, riddled with adverts, sometimes doesn't convert properly (the infamous "invalid public atom" error - see Google for more) and has a UI that truly stinks. On the upside, it's free, accepts a list of video files (via the oddly hard-to-find "1-click convert" button) and just gets on with the converting, as it already knows the correct settings for my iPod. One final nice touch is that once they are converted, it'll automatically upload them into iTunes. Are there any alternatives which have all the upsides but none of the downsides? Bonus points if they can set the metadata in iTunes correctly for TV shows (season, show, episode) and delete the converted file afterwards (as my iTunes settings mean that a copy is made elsewhere). I've looked at a bunch of applications (HandBrake, VirtualDub, MediaCoder, Format Factory, Any Video Converter, ConvertXtoDVD) but many of them fail the "just select a list of files and get on with converting" test - let alone all the other features I want. I have no desire to individually set the video size of each file or the codec or the post-processing options. I'm currently using the command-line version of HandBrake (HandBrakeCLI) and a hand-written DOS batch file to go through every file in a folder and convert it. It does most of what I want, just not in a very slick way. Can anyone recommend anything better? It needs to work on Windows 7 and be free.

    Read the article

  • Five stars of open data - example and review

    - by Joe
    (There may be a better-suited SE site for this question, so feel free to move it.) I have some data I'd like to make open to the public - it's a synthesis of related data retrieved from freedom of information requests over the last year. The data itself is at http://www.cs.rhul.ac.uk/home/joseph/domesday/Domesday-Scotland.csv or, for fans of Excel, at http://www.cs.rhul.ac.uk/home/joseph/domesday/Domesday-Scotland.xlsx . It's no more than a table with about five columns. I'd like to make this properly open data, so I was looking at the 5-star deployment scheme for Open Data. Much of it is fine, but I'm confused towards the end and could do with an explanation from people who know the answers. So to achieve the star levels I need to:
      - "make your stuff available on the Web (whatever format) under an open license" - trivial: all I have to do is put up notes on the page giving the provenance of the data.
      - "make it available as structured data (e.g., Excel instead of image scan of a table)" - done.
      - "use non-proprietary formats (e.g., CSV instead of Excel)" - done.
      - "use URIs to identify things, so that people can point at your stuff" - this is where I start to get a bit hazy: does this mean there should be a URI for every row in the table?
      - "link your data to other data to provide context" - this isn't massively clear to me: does this mean giving the provenance of the data? One column of the data I've put out is a link to where the data came from - is that the sort of thing we're looking at?
    Any and all information and answers welcome. EDIT - or if anyone wants to recommend an SE or other place to ask the question, that would be cool...

    Read the article

  • EMC VNX iSCSI setup - unsure about SP/port assignment

    - by pauska
    We have a new VNX5300 waiting to be configured, and I need to plan out the network infrastructure before the EMC tech arrives. It has 4 x 1 Gbit iSCSI ports per SP (8 ports in total), and I'd like to get the most out of the performance until we jump over to 10 Gbit iSCSI. From what I can read in the docs, the recommendation is to use only two ports per SP, with 1 active and 1 passive. Why is this? It seems kind of pointless to have quad-port I/O modules and then recommend not using more than two of the ports. Also, I'm a bit unsure about the zoning. The best practices guide states that you should separate each port on each SP from the others on different logical networks. Does this mean that I have to create 4 logical networks to be able to use all 8 ports? It also gives the following example: does this mean that A0 and B0 should sit on the same physical switch as well? Won't this make all traffic go over one switch (if both A1 and B1 are passive)? Edit: Another brain puzzle I don't get: each host (as in server) should not have more iSCSI bandwidth available than the storage processor. Why on earth does this matter? If server A has 1 Gbit and server B has 100 Mbit, then the resulting bandwidth between them is 100 Mbit. How can this result in some kind of oversubscription? Edit 4: Wait, what? Active and passive ports? The VNX runs in an ALUA configuration with asymmetric active/active... there shouldn't be any passive ports, only preferred ones.

    Read the article

  • Is execution of sync(8) still required before shutting down linux?

    - by Amos Shapira
    I still see people recommend use of "sync; sync; sync; sleep 30; halt" incantations when talking about shutting down or rebooting Linux. I've been running Linux since its inception, and although this was the recommended procedure in the BSD 4.2/4.3 and SunOS 4 days, I can't recall having had to do that for at least the last ten years, during which I have probably gone through thousands of Linux shutdowns/reboots. I suspect that this is an anachronism from the days when the kernel couldn't unmount and sync the root filesystem and other critical filesystems required even during single-user mode (e.g. /tmp), and it was therefore necessary to tell it explicitly to flush as much data as it could to disk. These days, without having found the relevant code in the kernel source yet (digging through http://lxr.linux.no and Google), I suspect that the kernel is smart enough to cleanly unmount even the root filesystem, and the filesystem is smart enough to effectively do a sync(2) before unmounting itself during a normal "shutdown"/"reboot"/"poweroff". The "sync; sync; sync" is only necessary in extreme cases where the filesystem won't unmount cleanly (e.g. physical disk failure) or the system is in a state where only forcing a direct reboot(8) will get it out of its freeze (e.g. the load is too high to let it schedule the shutdown command). I also never do the "sync" procedure before unmounting removable devices, and have never hit a problem. Another example: Xen allows the DomU to be sent a "shutdown" command from the Dom0, and this is considered a "clean shutdown" without anyone having to log in and type the magical "sync; sync; sync" first. Am I right, or have I just been lucky for a few thousand system shutdowns?
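    For what it's worth, a quick way to see whether there is anything left for sync to flush before a shutdown (a sketch; the Dirty/Writeback fields are standard in /proc/meminfo):

        # Dirty pages still waiting to be written back to disk
        grep -E '^(Dirty|Writeback):' /proc/meminfo

        # Force a flush and look again; once sync returns, both counters should be at or near zero
        sync
        grep -E '^(Dirty|Writeback):' /proc/meminfo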

    Read the article

  • What's needed in a complete ASP.NET environment?

    - by Christian W
    We have an ASP 3.0 application with a few ASP.NET (2.0) ditties mixed in. (Our long-term goal is to migrate everything to ASP.NET, but that's not important for this issue.) Our current test/deploy workflow is like this:
      1. Use Notepad++ or VS2008 to fix a bug/feature (depending on what I have open).
      2. Open my virtual test server.
      3. Copy the fixed file over, either with Explorer or, if I can be bothered to open it, WinMerge.
      4. Test that the fix works.
      5. Close the virtual test server.
      6. Connect to our host with VPN.
      7. Use WinMerge to update the necessary files.
      8. Pray to higher powers that the production environment is not so different that something bombs.
    To make things worse, only I have access to my "test server", so I'm the only one testing it. I really want to make this a bit more robust; I even have a Subversion setup running. But I always forget to commit changes... and I don't even work in my checked-out folder, but in a copy of what is currently in production... Can someone recommend some good reading on deploying, testing, staging and the like? I currently use VS2008 and want to use Subversion or Git (or any other free VCS). Since I'm the only developer, Team System is not really an option (cost-related). I have found myself developing an "improved" feature, only to find a bug in the same feature in the production system. And since my "improved" feature involved deleting some old functionality, I have to fix bugs directly in production... That's not a fun feeling... (I inherited this system recently, so it's not directly my fault that it is like this ;) )
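    As a minimal Subversion-based sketch of a less fragile loop (the repository URL, revision number and paths are placeholders), the idea is to work only in a checked-out copy, commit every change, and deploy by exporting a known revision rather than hand-copying files:

        # Work in a checked-out copy, never directly on production files
        svn checkout https://example.com/svn/webapp/trunk webapp
        cd webapp
        # ... edit in VS2008 or Notepad++, then record the change ...
        svn commit -m "Fix bug in order form"

        # Deploy to the test server by exporting a clean copy of what was committed
        svn export --force https://example.com/svn/webapp/trunk /path/to/test-server/webroot

        # Once tested, export the very same revision to production over the VPN share
        svn export --force -r 1234 https://example.com/svn/webapp/trunk /path/to/production/webroot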

    Read the article

  • Software/hardware to build video streaming server?

    - by Sasha Yanovets
    I am looking for a video streaming server solution, something like an online TV server, with the ability to do live broadcasts over the internet. What software would you recommend for that? What kind of hardware should it run on; does it need anything special? I am looking for a solution that could scale up to at least 1000 simultaneous users online with good video resolution. I think it is good to have a general answer on what direction to choose, but here are more details on my specific case: I am looking for a solution almost from scratch. We have some video content that we've produced, but it is not delivered over the internet yet. We are not tied to any particular vendor for now. We want to stream 24 hours a day: three 8-hour blocks, with the content changing every day. We want the ability to do regular live broadcasts. I guess we will need several streaming quality options (low ~56 kb/s, mid ~273 kb/s). Some terms are simply foreign to me (like play-truncation rate); if you could point out what parameters we should be aware of, that would be great. The uplink to the internet is still to be determined. We plan to start with something and scale up along the way. If you already have some kind of media streaming server, just describe its configuration here (hardware, OS, software) and the peak number of concurrent users it serves. I think it could help people approaching this task.

    Read the article

  • Looking for a "light" compositing manager for GNOME

    - by detly
    I have an HP Pavilion DM3 (graphics is nVidia GeForce G105M), running Debian Squeeze with GNOME 2.30. My preferred DE is GNOME + Metacity + Nautilus. I'd like to use Docky, but it requires compositing, so I'm looking for a relatively "light" compositing manager. I realise that "light" is ambiguous, but I basically want something that won't chew through my notebook's battery because of CPU or GPU usage. I know that Metacity is capable of compositing, but as far as I'm aware it's still in testing. Some people report that it's smooth and lightweight; others claim that it eats up processor time. I've also seen references to a problem with nVidia, but no actual details. I'm not averse to Compiz, but I haven't used it before and I don't know what to expect in terms of "weight". And maybe there's something else I haven't heard of. So can anyone recommend anything? Or dispel my idea that Metacity is not the right tool for the job? (Originally posted on the GNOME forums.)
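    For reference, Metacity's built-in compositor in GNOME 2.30 is controlled by a single GConf key, which makes it easy to try and just as easy to back out of (a sketch; gconftool-2 ships with the standard GNOME 2 tools):

        # Enable Metacity's own compositor (enough for Docky); takes effect immediately
        gconftool-2 --type bool --set /apps/metacity/general/compositing_manager true

        # If it turns out to be heavy on this hardware, turn it off again
        gconftool-2 --type bool --set /apps/metacity/general/compositing_manager false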

    Read the article

  • TFTP Timing Out on Ubuntu VM

    - by valsidalv
    I'm running a Windows 7 PC with VMware installed, which hosts my Ubuntu VM (10.04 Lucid Lynx). I recently installed a DHCP server and TFTP (xinetd tftpd) using these instructions. I've mapped a network drive so that my Windows has access to all the files in my VM through a 192.x.x.x IP address. I'm trying to put some custom firmware onto a router. The router has its own built-in TFTP utility that will download the image; it successfully manages to do everything, but it is slow because it writes the image to flash memory. There is another method that is much quicker because it writes to RAM directly, but it must use the TFTP server in Ubuntu. The issue I'm facing is that the Ubuntu TFTP transfer seems to be timing out: the transfer starts but never goes past ~60%. Here's my /etc/xinetd.d/tftp file (similar to a known working config):

        service tftp
        {
            protocol    = udp
            port        = 69
            socket_type = dgram
            wait        = yes
            user        = nobody
            server      = /usr/sbin/in.tftpd
            server_args = -s /home/user/tftp/
            disable     = no
            cps         = 300 2
            per_source  = 60
        }

    I've done some searching but can't find any parameters for this file that control the timeout or the number of retries. The last two arguments (cps, per_source) are completely alien to me (can anyone explain them?). I have a few possible solutions, but the easiest would be to get this TFTP server working. Can anyone help, either with a timeout configuration or maybe by recommending a different TFTP server? Thanks!
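    If it does come down to trying a different TFTP server, a standalone tftpd-hpa setup is a common fallback (a sketch; the variable names below are those used by newer tftpd-hpa packages, older ones use a RUN_DAEMON/OPTIONS pair in the same file instead):

        # Replace the xinetd-managed tftpd with the standalone daemon
        sudo apt-get install tftpd-hpa

        # Point it at the same directory and run it standalone
        cat <<'EOF' | sudo tee /etc/default/tftpd-hpa
        TFTP_USERNAME="tftp"
        TFTP_DIRECTORY="/home/user/tftp"
        TFTP_ADDRESS="0.0.0.0:69"
        TFTP_OPTIONS="--secure"
        EOF
        sudo service tftpd-hpa restart

        # Quick local test: fetch a file back over the loopback interface
        tftp 127.0.0.1 -c get firmware.bin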

    Read the article

  • Best usage for a laptop being used as a desktop without removable batteries

    - by Senseful
    After reading the information on http://batteryuniversity.com, I realize that one of the best ways to permanently damage a lithium-ion battery is to use it at a high temperature while it's fully charged. This is exactly what happens when you use the computer as if it were a desktop, since leaving it plugged in keeps the battery at 100% and using the computer heats the battery up. This is why it's recommended to remove the battery from your laptop if you are using it in this scenario. My question is: what would you do if the laptop doesn't have a removable battery (e.g. a MacBook Pro)? Should I use some kind of charge cycle, such as: charge to 80%, unplug the power cord, use the laptop until it reaches 20%, then repeat the cycle by charging to 80% again? If so, which values should I use instead of 80% and 20%? (I think charging to 80% is better than 100% because of the damage that a hot battery at 100% can do, but I just made the 80% figure up, and I'm sure there's a better number to strive for which is backed by science.) I've read many of the articles on batteryuniversity.com, but couldn't find anything pertaining to this. Update: What about doing something like charging (or discharging) it to 50%, then plugging it in and turning on settings which use the battery as much as possible (e.g. brightness all the way up, Wi-Fi on, etc.), in order to try to maintain the battery at 50% (i.e. it charges at the same rate it discharges)? This would probably heat up the battery, but it would mean you don't need to constantly plug and unplug the laptop. The one bad thing is that you are using up more charge cycles, which would decrease the battery's life, so I'm not sure this is a good idea.

    Read the article
