Search Results

Search found 22000 results on 880 pages for 'worker process'.


  • Minimal Linux distribution with sshd and apt

    - by Sergey Mikhanov
    When I signed up for my Debian Linux VPS hosting and first logged on and ran ps, there was only one user process running: sshd. As far as I could tell, this was a minimal Linux install with only two things installed and configured: sshd and apt (plus all dependencies, of course). I want to build (or use an existing) similar Linux distro; any advice on how to build or pick one? Googling "minimum linux" or "linux with sshd only" usually brings up Debian's netinstall, which is not what I want. Thanks in advance.
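    One hedged way to build something like this yourself is debootstrap, sketched below; the target directory and mirror are illustrative, and --variant=minbase keeps only the essential packages plus apt:

        # Bootstrap a minimal Debian system into /mnt/target (run as root).
        debootstrap --variant=minbase stable /mnt/target http://deb.debian.org/debian

        # Add the SSH daemon inside the new system; everything else stays minimal.
        chroot /mnt/target apt-get update
        chroot /mnt/target apt-get install -y openssh-server

    After that you would still need a kernel, a bootloader and network configuration before the result boots on its own, so this is a starting point rather than a full recipe.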

  • Browsing is much slower on one PC wired to the same router - why?

    - by deanalt
    Wife is not happy. It takes about 5 seconds to open a Google window, versus about 1 second on the faster computer, which is about 3 years old itself. Yes, it is an older computer (5-6 years old, I'd guess), surely with less RAM, but for simple browsing, should it matter? Both are hardwired to the same Netgear Rangemax router. Both use fixed IP addresses. Both run XP. Both have about 8 feet of cable to the router. I have the fastest service my cable provider offers. Probably irrelevant, but two newer Macs are connected wirelessly during the summer and they are even faster, though I think that's down to the difference in browsers. If you could point me to a list of process-of-elimination steps, that would be most appreciated. Thanks, Dean

  • Bad HD video deinterlacing processing

    - by Guy Fawkes
    I have Ubuntu 12.04 32-bit with Unity. My system configuration is: CPU: Core 2 Quad Q6600 (2.4 GHz); RAM: 8192 MB DDR2 Kingston; Video: Palit GeForce GTX 260 216 SP; screen resolution 1680x1050. I also have Windows 7 Ultimate installed, and I can watch the same files in Media Player Classic without any horizontal lines. I've installed the VDPAU driver, NVIDIA drivers 304.51, and MPlayer 2 (within SMPlayer). I've disabled the "Sync to VBlank" option in CCSM (because otherwise, by default, the MPlayer process uses about 50-60 percent of my CPU), tried switching between the different deinterlace options in SMPlayer, used the "-vc ffh264vdpau,ffmpeg12vdpau" (without quotes) parameters for MPlayer, and switched to "Ubuntu 2D", but so far with no results. Any suggestions? How should I set up MPlayer?
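    A hedged sketch of an invocation that pairs VDPAU decoding with VDPAU's own temporal deinterlacer, in case SMPlayer's deinterlace options are being applied in software instead; the file name is illustrative:

        # Decode H.264/MPEG-2 on the GPU and let the vdpau output driver deinterlace (deint=2 is temporal).
        mplayer -vo vdpau:deint=2 -vc ffh264vdpau,ffmpeg12vdpau, video.ts

    The trailing comma in the -vc list just lets MPlayer fall back to its default decoders for other formats.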

  • TraceTune supports uploading Zip files

    - by Bill Graziano
    I’ve been using the online version of ClearTrace more and more lately.  When I get to a new client it’s just much easier to upload a trace file rather than install ClearTrace. That means I’ve finally been adding more features to it.  The two latest features are around ease of use. You can now upload a ZIP file that contains a trace file.  Trace files are already somewhat compressed.  Putting them in a ZIP file further compresses them by a factor of 8X or 9X in my testing. That means you can start with a 100MB trace and end up with a 10MB-12MB ZIP file to upload.  I’m consistently able to get over 150,000 events in a 100MB ZIP file.  That gives me a pretty good look at a system. The second part of this is that files are now processed asynchronously.  After you upload a file you’ll be taken to a processing page that updates every few seconds with the number of rows processed.  It generally takes under a minute to process a 100MB trace file but I *hated* staring at a blank screen. Give TraceTune a try.  It’s getting easier to use every day.

  • Backup / Disaster Recovery, should I store RAR-compressed files?

    - by moraleida
    I'm in the process of recovering files from an accidentally formatted ext4 partition using PhotoRec. It had about 300GB of data, of which I've already recovered about 30GB. So far, the recovery of RAR-compressed files has been much more successful than the recovery of individual uncompressed files and ZIP-compressed files, in the sense that a lot of the recovered files/ZIPs were unreadable, while pretty much all of the RAR files were intact. Is there really such a relationship? Are RAR-compressed files less prone to corruption and thus easier to recover?
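    A couple of hedged commands that relate to this: unrar can verify each recovered archive against its stored per-file CRCs, and when you create new RAR backups, an optional recovery record is what lets a damaged archive be repaired later (archive names and the path are illustrative):

        # Test a recovered archive; every file inside is checked against its CRC.
        unrar t recovered_0001.rar

        # When creating future backups, add a 5% recovery record so rar can repair damage later.
        rar a -rr5% backup.rar /path/to/data
        rar r backup.rar        # attempt a repair if the archive is later found damaged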

  • How to generate new CSRs for TLS use in sendmail?

    - by Mikey B
    SendMail 8.13.8 | CentOS 5.x. Hi guys, I'm using CA-signed TLS certificates on my sendmail server and they are up for renewal soon. Our new CA doesn't like our old CSR, so I need to generate a new one. Can someone point me to the procedure for doing this (without affecting the production certs that are already in use)? I'm paranoid about overwriting the old TLS certs in the process of generating a CSR. Most of the instructions I've found are for implementing self-signed TLS certs, which isn't an option for me at this time. I'm thinking it would be something like: openssl req -new -nodes -out new-tls.csr -keyout new-tls-private.key But I wasn't sure if I was missing some options there, such as the -x509 option... -M
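    A hedged sketch of what that might look like with a fresh key written to new files, so nothing sendmail currently uses is touched; the file names and key size are illustrative. Note that -x509 would produce a self-signed certificate instead of a CSR, so it is exactly the option you do not want here:

        # New private key and CSR, written to new files only.
        openssl req -new -nodes -newkey rsa:2048 \
            -keyout new-tls-private.key -out new-tls.csr

        # Double-check the CSR contents before sending it to the CA.
        openssl req -in new-tls.csr -noout -text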

  • How to make an iOS plugin for Unity3d

    - by DannoEterno
    I've spent the last 2 days reading articles and books trying to understand how to make an iOS plugin for Unity. Basically I just need a demo to understand how it works. So far I've tried the following (with really poor luck). I started a new project in Unity and wrote a simple script:

        using UnityEngine;
        using System.Collections;
        using System;
        using System.Runtime.InteropServices;

        public class CallPlugin : MonoBehaviour {
            [DllImport ("__Internal")]
            private static extern int test();

            void Start () {
                Debug.Log(test());
            }
        }

    Then I created a project in Xcode with this simple source file:

        extern "C" {
            int test() {
                int che = 5;
                return che;
            }
        }

    Then I tried:

    - putting the .mm and .h in Assets/Plugins/iOS = nothing
    - building the Unity project and then adding the .h and .mm to the Xcode project = nothing

    In Unity I always get an EntryPointNotFoundException, so Unity sees the file but is unable to reach the method. The problem is... how?! :) Maybe I'm missing something or did something wrong? Thanks a lot for any help you can give me :)

  • How do I install the nvidia driver for a GeForce 9300M?

    - by Alex
    Yesterday I installed Ubuntu 11.04 (in place of 10.10), and I need to install the NVIDIA driver that supports OpenGL 3.3. In 10.10 I did it this way: Ctrl+Alt+F1, log in, sudo /etc/init.d/gdm stop, sudo sh driver.run, startx. Now it doesn't work, because Ctrl+Alt+F1 doesn't show a login screen, just a black screen. I've googled this problem; some people have it but no one knows how to solve it. Sometimes people say it is connected with the video card or driver, but I have a GeForce 9300M G and I've activated the standard driver. Anyway, it worked in 10.10 but doesn't work now. The main problem is that I need to kill the X server to install this driver, and killing the process just restarts the X server. I've also tried /etc/init.d/gdm stop in a GUI console; it says "Fake initctl, doing nothing". Google didn't help in this case either. Any ideas on how to install that driver?
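    A hedged sketch of the usual sequence on an Upstart-based release like 11.04, where the Upstart service command is what actually stops GDM (the /etc/init.d script is only a compatibility stub, hence the "Fake initctl" message); the installer file name below is illustrative:

        # From a text console (Ctrl+Alt+F1), stop the display manager via Upstart.
        sudo service gdm stop

        # With X down, run the NVIDIA installer, then bring the display manager back.
        sudo sh ./NVIDIA-Linux-x86-295.run    # illustrative file name
        sudo service gdm start

    If the text console itself stays black, adding nomodeset to the kernel boot options is a common workaround worth trying before anything else.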

  • Quantifying the Value Derived from Your PeopleSoft Implementation

    - by Mark Rosenberg
    As product strategists, we often receive the question, "What's the value of implementing your PeopleSoft software?" Prospective customers and existing customers alike are compelled to justify the cost of new tools, business process changes, and the business impact associated with adopting the new tools. In response to this question, we have been working with many of our customers and implementation partners during the past year to obtain metrics that demonstrate the value obtained from an investment in PeopleSoft applications. The great news is that as a result of our quest to identify value achieved, many of our customers began to monitor their businesses differently and more aggressively than in the past, and a number of them informed us that they have some great achievements to share. For this month, I'll start by pointing out that we have collaborated with one of our implementation partners, Huron Consulting Group, Inc., to articulate the levers for extracting value from implementing the PeopleSoft Grants solution. Typically, education and research institutions, healthcare organizations, and non-profit organizations are the types of enterprises that seek to facilitate and automate research administration business processes with the PeopleSoft Grants solution. If you are interested in understanding the ways in which you can look for value from an implementation, please consider registering for the webcast scheduled for Friday, December 14th at 1pm Central Time in which you'll get to see and hear from our team, Huron Consulting, and one of our leading customers. In the months ahead, we'll plan to post more information about the value customers have measured and reported to us from their implementations and upgrades. If you have a great story about return on investment and want to share it, please contact either [email protected]  or [email protected]. We'd love to hear from you.

  • Lotus notes 8.5 quota

    - by Cividan
    We're using Lotus Notes 8.5 and I have a user who was over his quota: he had sent 6 emails with attachments over 800 MB (no comment...). I deleted these oversized emails and emptied the trash, but Domino keeps sending quota warning emails. I checked the All Documents view and they are no longer there, and I emptied the trash again. I saw a post on the internet saying to compact his database. When I go to File, Application, Properties and click the Info tab, I see that he uses 35.7% of the 3 GB database. When I click "Compact" I see a message saying the database compact is being processed... the message disappears after about 1 minute, but nothing else seems to happen, and when I look back later the space problem has not changed. Any advice would be appreciated.

  • What is a good solution for an intranet video portal (YouTube-like) site?

    - by Ken Pespisa
    I would like an easy-to-set-up site to handle videos to be viewed internally by my company. YouTube is essentially the perfect solution except for its being public. I'm looking for a place where a few people can upload videos, and the system will return a page where they can watch each video in a browser. I figure this would involve a dedicated web server to run the web application and process the videos. I've searched and I don't think such a system exists, but perhaps there's one out there in its infancy that doesn't rank high on Google yet. Essentially the site I'm looking for is what MediaWiki is to wikis, or what StackExchange is to Q&A sites, but for videos. Thanks in advance!

  • How to configure Nginx to serve a variety of back-ends via multiple FCGI processes?

    - by Ben Horton
    I've seen a lot of tutorials showing how to set up PHP/Python/Perl/RoR on nginx via various FCGI processes, but none of the tutorials I found show how to serve multiple FCGI services off one server. How would one configure the stable nginx (nginx-0.7.64) to serve multiple FCGI processes (one for each of the above languages)? Example addresses for each FCGI process are as follows:

    - 127.0.0.1:8080 - PHP
    - 127.0.0.1:8081 - Python
    - 127.0.0.1:8082 - Perl
    - 127.0.0.1:8083 - Ruby on Rails

    An example configuration file that shows how to implement multiple FCGI backends off one server is really what I need. Perhaps others will benefit as well.
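    A hedged sketch of what the relevant server block might look like, assuming each backend above is already listening on its address; the URL prefixes and document root are illustrative and would need adapting to how the apps are actually laid out:

        server {
            listen 80;

            # PHP: hand *.php requests to the PHP FastCGI process.
            location ~ \.php$ {
                include        fastcgi_params;
                fastcgi_param  SCRIPT_FILENAME /var/www/php$fastcgi_script_name;
                fastcgi_pass   127.0.0.1:8080;
            }

            # Python app mounted under /python/.
            location /python/ {
                include        fastcgi_params;
                fastcgi_pass   127.0.0.1:8081;
            }

            # Perl app mounted under /perl/.
            location /perl/ {
                include        fastcgi_params;
                fastcgi_pass   127.0.0.1:8082;
            }

            # Rails app mounted under /rails/.
            location /rails/ {
                include        fastcgi_params;
                fastcgi_pass   127.0.0.1:8083;
            }
        }

    Each location simply points fastcgi_pass at a different backend; nothing ties nginx to one FastCGI service per server.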

  • Windows XP - non-user input data filter message after installing wireless keyboard & mouse

    - by James
    After I installed the MS wireless keyboard and mouse and the associated software, I started getting an annoying message titled "Hardware Installation" telling me the software I am trying to install did not pass the XP logo tests. The software is for "HID non-user input data filter" and I have two options: continue anyway or stop the installation. If I continue, the installation fails; if I stop it, another message pops up with a little mouse logo and the whole process repeats itself. After I am done with that message, a third dialog appears. This happens every time I boot up my PC (a desktop). I tried following advice I found in a forum and downloaded the Windows update for the HID non-user input data filter, but that installation failed as well. The thing is, both the keyboard and the mouse are working fine. Is there any way to get past these dialogs?

  • LiveCD/USB boot issues with Ubuntu 12.04 on blank drive

    - by Richek
    Not sure how common this issue is, or how badly I may be missing something simple, but I am a first-time user having some serious problems. Some background: the old HDD running Windows 7 developed too many bad sectors and is bricked. I'm attempting to install Ubuntu 12.04 on a fresh 1TB drive by booting from a live USB flash drive. I've not been able to get past the initial menu screen, however, as the process stalls out shortly after selecting an option (both "boot from drive" and "install to drive"). I've tried multiple USB drives as well as CDs, modified the boot order, flashed the BIOS, and even tried booting with only the flash drive and the keyboard connected, with the same results. Typically what I observe is that the OS begins what I think is compiling: it lists drivers and components before freezing on one. When the keyboard is plugged in, it's the keyboard driver; before I flashed the BIOS, it was a BIOS-related item; now it's an unknown entry. The computer seems to be reading the drive (indicated by the USB light flashing or the CD drive revving) for roughly 10 minutes with no progress, after which the drives go quiet. Some spec info: Motherboard: ASUS P5Q Pro, BIOS version 2102 (latest version), Intel chipset. CPU: Intel Core 2 Duo E8400 Wolfdale 3.0GHz. Help would be appreciated!

  • Not able to safely remove external disk after having mounted and unmounted a VHD on it

    - by Agnel Kurian
    I am using Windows 7 SP 1. I have an external hard disk (Seagate 500GB) which I am able to use without problems most of the time. I am able to plug it in, use it and then safely unmount it via the "Eject USB Mass Storage Device" option in the taskbar tray. However, if I attach a VHD file located on this disk using "Disk Management", then detach the VHD and finally try to safely disconnect the disk via the system tray, I get an error which says: "Problem Ejecting USB Mass Storage Device: Windows can't stop your 'Generic volume' device because a program is still using it. Close any programs that might be using the device, and then try again later." How do I avoid this problem? Which process could still be accessing the device (even after I have closed the "Disk Management" application) ?

  • SSD I/O extremely slow installing/booting Ubuntu 12.04

    - by Menda
    These are some useful specs: MacBook Pro 7,1; OWC Mercury Extreme Pro 2.5" SATA SSD (120 GB) with a SandForce controller; Ubuntu 12.04 Desktop 32-bit; one 18 GB partition for GNU/Linux and 1.5 GB for swap; the MD5 of the Ubuntu install CD checks out. I tried to install Ubuntu. Everything seems to be recognized, but there's a big problem: reads and writes to the SSD are extremely slow. For example, the install process, which shouldn't take more than 20 minutes, takes 7 hours. Then, booting the computer takes about 20 minutes. I checked, and the problem is definitely the SSD: every access to any file is about 10 times slower than normal. I have tried formatting the partition as ext4 and ext3 with the same problem. Trying other distros like Fedora 17, I see a similar problem; there's a "lag" with the SSD, but not as pronounced as in Ubuntu. Surprisingly, Debian 6.0 installs and works without any problem. Mac OS works pretty well in the other partition too, so I rule out a hardware problem with the SSD. Thanks for your help!
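    A couple of hedged checks from the live session that can help narrow down where the slowness comes from; the device name is illustrative:

        # Raw sequential read speed, bypassing the filesystem; a healthy SATA SSD should be well over 100 MB/s.
        sudo hdparm -tT /dev/sda

        # Look for SATA link resets or the link being negotiated down to a lower speed.
        dmesg | grep -iE 'ata|sata|exception'

    If the raw numbers look fine but installs are still slow, the cause is more likely partition alignment or the installer's I/O pattern than the drive itself.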

  • Processing files from a Content Distribution Network problem

    - by Derek
    From what I understand, CDNs are meant to physically cache your static files in multiple regions closer to your users. However, I've noticed a few websites where, when a page is requested from their server, they grab the asset files from their CDN, process them (compress, minify, etc.), cache the results on their server, and then send them to the user requesting the page. This doesn't make much sense to me. Wouldn't processing the files on your server eliminate the gains from using a CDN? Is this a normal way of doing things, or am I not understanding the whole asset management concept?

  • RESTful application logic and cross-resource operations

    - by Gaz_Edge
    I have a RESTful API that allows my users to receive enquiries about their business, e.g. 'I would like to book service x on date y. Is this available?'. The API saves this information as a resource at the following URI:

        users/{userId}/enquiries/{enquiryId}

    The information shown when this resource is retrieved is the standard sort of thing you'd expect from an enquiry: email, first_name, last_name, address, message. The API also allows customers to be created for a user. A customer has a login and password and also a profile. The following URIs expose these two resources:

        PUT users/{userId}/customers/{customerId}
        PUT users/{userId}/customers/{customerId}/profile

    The problem I am having is that I would like to allow users to create a customer from an enquiry. For example, the user is able to offer their service on the date requested and will then want to set up a customer with login details etc. to allow them to manage the rest of the process. The obvious answer would be to use a URI like users/{userId}/enquiries/{enquiryId}/convert-to-client. The problem with this is that it somewhat goes against a lot of what I've been reading about how to implement REST (specifically the book Restful Web Services, which suggests that URIs should point to resources, not to operations on resources). The other option would be to get the client application (i.e. the code that calls the API) to handle some of this application logic, but that doesn't quite feel right to me: in my design the client app is fairly dumb; it knows just enough to display the results from the API and contains no application logic. It would be great to hear others' views on the best way of setting this up. Am I wrong to have no application logic in the client app? How would I perform this operation purely in the REST API?
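    One hedged, resource-oriented alternative is to treat the conversion itself as creating a new customer whose representation points back at the enquiry it came from, so the URI still names a resource rather than an operation. A sketch with curl, where the host, field names and IDs are all illustrative:

        # Create a customer in the customers collection, citing the enquiry it was built from.
        curl -X POST https://api.example.com/users/42/customers \
             -H "Content-Type: application/json" \
             -d '{"source_enquiry": "/users/42/enquiries/17", "login": "jane", "password": "s3cret"}'

    The server can then copy whatever it needs from the enquiry into the new customer, which keeps that logic on the API side rather than in the dumb client.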

  • How to remove a large number of files/folders in Linux

    - by user1745713
    We are using Hadoop to split a table into smaller files to feed to Mahout, but in the process we created a huge number of _temporary logs. We have an NFS mount for the Hadoop volume, so we can use all the usual Linux commands to delete folders and files, but we just can't get them deleted. Here's what I've tried so far:

    - hadoop fs -rmr /.../_temporary : hangs for hours and does nothing
    - on the NFS mount: rm -rf /.../_temporary : hangs for hours and does nothing
    - find . -name '*.*' -type f -delete : same as above

    The folders look like this (38 of these folders inside _temporary):

        drwxr-xr-x 319324 user user 319322 Oct 24 12:12 _attempt_201310221525_0404_r_000000_0

    The contents of these are actually folders, not files; each one of those 319322 folders has exactly one file inside. Not sure why it does the logging this way. Any help is appreciated.
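    Two hedged things that sometimes help with trees this large; the paths are illustrative. The first avoids the HDFS trash (moving millions of entries into .Trash is itself expensive), and the second replaces the target with an empty directory via rsync, which in some setups deletes entries with less per-file overhead than rm:

        # Option 1: delete in HDFS while skipping the trash, if your Hadoop version supports the flag.
        hadoop fs -rmr -skipTrash /path/to/output/_temporary

        # Option 2: via the NFS mount, mirror an empty directory over the target, then remove both.
        mkdir /tmp/empty
        rsync -a --delete /tmp/empty/ /nfs/hadoop/path/to/output/_temporary/
        rmdir /nfs/hadoop/path/to/output/_temporary /tmp/empty

    Either way, with roughly 12 million directory entries (38 x 319322) the delete will take a long time; the goal is mainly to keep it from hanging outright.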

  • 500 internal server error

    - by Rockr
    I am facing a 500.0 Internal Server Error quite frequently with my website. The error details are given below:

        HTTP Error 500.0 - Internal Server Error
        C:\PHP\php-cgi.exe - The FastCGI process exceeded configured activity timeout

    - Module: FastCgiModule
    - Notification: ExecuteRequestHandler
    - Handler: PHP_via_FastCGI
    - Error Code: 0x80070102
    - Requested URL: http://mydomain.com:80/index.php
    - Physical Path: C:\HostingSpaces\coderefl\mydomain.com\wwwroot\index.php
    - Logon Method: Anonymous
    - Logon User: Anonymous

    When I contacted the support team, they said my site is making heavy SQL queries. I am not sure how to debug this, but my site is very small and the database is optimized. I'm running WordPress as the platform. How do I resolve this issue?
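    A hedged sketch of one knob on the IIS side: the error means php-cgi.exe produced no output for longer than the FastCGI activityTimeout, and that timeout can be raised with appcmd while you hunt for the slow query or plugin (the 300-second value is illustrative, and raising it only hides the symptom):

        REM Raise the activity timeout for the php-cgi FastCGI application to 300 seconds.
        %windir%\system32\inetsrv\appcmd set config -section:system.webServer/fastCgi ^
            /[fullPath='C:\PHP\php-cgi.exe'].activityTimeout:300 /commit:apphost

    On the WordPress side, enabling the MySQL slow query log (or a profiling plugin) is usually the quicker route to whatever is actually taking that long.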

  • Is it possible to change the motherboard/bios vendor name? [closed]

    - by vignesh4303
    Possible Duplicate: How can I change my BIOS splashscreen? When we start the system we normally see the motherboard vendor's name. For example, I have a system with a Mercury motherboard: whenever I restart the system, the Mercury logo is displayed (you will see your respective vendor's logo) and the regular boot process goes on. My question is: is there any possibility of changing the name and logo shown at startup (on my system it is Mercury; it differs based on the motherboard you are using)? If you are a laptop user, you will see the laptop vendor's name instead, e.g. "Dell". Asus users, refer to Bon Gart's comment. I ask for everyone who feels frustrated by the motherboard/BIOS splash screen and would like to change it.

  • Installing Visual Studio 2010 SP1 or Windows Phone tools in your VM (danger!)

    - by Jeff
    If you've read my blog for any amount of time, you probably know that I tend to develop stuff in a Parallels VM on a Mac. It's how I roll. I like VM's because I can trash them and do really stupid things with beta software. That said, there is a pain point that doesn't seem that well documented when it comes to installing stuff in this scenario.The WP7 tools, and SP1 for Visual Studio 2010 (perhaps only if you already have the WP7 tools installed, I'm not sure), do something strange on install. As if it weren't already a long and slow installation, for reasons I don't understand, the installer fires up an instance of Windows Phone Emulator. As you may already know, the emulator doesn't run in a VM, because it is itself a VM, apparently. What it will do is fire up your CPU, make your comprooder hot and make the fans blow harder.I found this out accidentally, as I started the (slow) phone tool installation once, and walked away. An hour and a half later, I came back to find it hadn't finished. But it was hot and the CPU was pegged, so I fired up the task manager to find XDE.exe, the phone emulator, cranking away. I had to kill it several times, and eventually the install finished. It fired up just once in the SP1 install, but it still had the same hanging effect.I can't for the life of me figure out why it does this. In a VM, I can connect the phone to it and use that, so I don't need the emulator. But this install, firing up the emulator, will make it choke until you kill the XDE.exe process. Watch out!

  • Splitting Multiple Files in Windows

    - by Justin Boucher
    We have a 21TB LUN full of images that are approximately 600KB in size, in multiple subfolders on the disk. We are trying to split the 21TB LUN into 8 smaller LUNs of about 2.6TB apiece in order to process the images more effectively. My question is how we can determine which folders add up to 2.6TB on the drive. What is the best tool to mark this data so we can copy it to the new smaller LUNs with robocopy or emcopy without overfilling them? Is there a third-party tool that would be better suited for this task? Thank you in advance for your assistance.

  • Offshoring: does it ever work?

    - by DanSingerman
    I know there has been a fair amount of discussion on here about outsourcing/offshoring, and the general opinion seems to be that at best it is difficult, and at worst it fails. I have direct experience of offshoring myself; a previous company where I was a dev manager wanted to send some development offshore, and we ran a pilot scheme to see how well it would work. Of course it was a complete failure, although it is not completely clear to me whether this was down to the offshore devs being less talented, the process, or other factors (no doubt it was really a combination). I can see as a business how offshoring looks attractive (much lower day rate), but as far as I can see, the only way it could possibly work is if you do exceptionally detailed design up front, with incredibly detailed specifications; and by the time you have invested in producing that, you have probably spent nearly as much as if you had written the actual code locally (which I think is an instance of No Silver Bullet). So, what I want to know is: does anyone here have any experience of offshoring actually working, ever? Especially if there are any success stories of it working in a semi-agile way? I know there are developers here from all over the world; has anyone worked on an offshore project they consider successful?

  • Best idea for a data server serving small pictures (~40 KB)

    - by Nicolas Manzini
    I'm designing the server structure for my application in case things go well. I have one DB server connected to multiple servers that process connections, all of them with lots of RAM and fast processors. (I'm still looking for a way to use multithreading, because right now it's plain Apache + PHP... so lots of RAM is needed.) Upon getting an answer from those servers, the client can then connect to another server to retrieve pictures, using the address it previously got from the DB. Is it a good idea to have one database server (with, let's say, nginx and an SSD disk) having to send all the pictures to everybody? Or should I have multiple servers accessing a shared SSD drive, or multiple disks updating each other? Also, should I put a lot of RAM in the database server? Because there probably won't be one picture much more popular than another.
