Search Results

Search found 10023 results on 401 pages for 'manage processes'.

Page 203/401

  • MacPorts Apache, PHP, DB: how do I test on another device?

    - by brokenindexfinger
    My supervisor suggests using MacPorts to install and manage different versions of Apache and PHP, as well as both MySQL and PostgreSQL databases, since we need to test our platform on different versions of each. So far I've just been using the default Apache installation on OS X Lion and the default PostgreSQL installation. My question is this: once I turn Web Sharing off and proceed with a custom Apache 2 setup based in /opt/local/, how do I expose my machine's IP to other devices for testing? With Web Sharing, I can get my machine's IP and use that to test with an iPad and iPhone. Will that still be the case, and if so, how do I do it?
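
    For reference, a minimal sketch of what this usually comes down to; the interface name en0 and the port are assumptions, not taken from the question:

        # find the Mac's LAN address (en0 is typically the built-in Ethernet/Wi-Fi interface)
        ipconfig getifaddr en0

        # make sure the MacPorts Apache listens on all interfaces, not just localhost,
        # e.g. in /opt/local/etc/apache2/httpd.conf:
        #   Listen 80
        # then browse to http://<that-ip>/ from the iPad or iPhone on the same network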

    Read the article

  • Launch script after SFTP disconnect

    - by Mates
    I'm currently using Caja (basically the same as Nautilus) to connect to my server over SSH and work with files. What I'm looking for is a way to launch a simple script when I disconnect. I can launch a script after disconnecting from the TTY by putting it into the ~/.bash_logout file, but that is not executed when disconnecting from a file manager. The only idea I have is to set up a cron job which would periodically check for running sftp-server or sshd processes and launch the script when no such process is found. Is there any easier way to do this?
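
    For what it's worth, a minimal sketch of that cron approach; the username, marker file and script path are assumptions:

        #!/bin/sh
        # run from cron every minute; fires the cleanup script once,
        # right after the last sftp-server session for this user has gone away
        if pgrep -u mates -x sftp-server >/dev/null; then
            touch /tmp/sftp_was_active
        elif [ -e /tmp/sftp_was_active ]; then
            rm -f /tmp/sftp_was_active
            /home/mates/on_sftp_disconnect.sh
        fi
        # crontab entry: * * * * * /home/mates/check_sftp.sh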

    Read the article

  • Can't access an iSCSI volume

    - by jmiguel.rodriguez
    I have an iSCSI target at a customer site that I'm using from an old Fedora (Core 6) server. I configured it and formatted it as ext3 (a mistake, I know now) and have been working with it for some time. Now I need to access this volume from another machine. As far as I've read, I can't safely use it from two machines at the same time (yes, that's the first thing I tried). So I unmounted it from the original server and tried to mount it on the new server (I tried first with Ubuntu 10 LTS, and when that failed I installed another Fedora with the same configuration), with no success. The problem: I can see all the targets on the NAS, but when I run "fdisk -l" to see all devices and work out which one to mount, every target shows up as an SFS filesystem. From the original server I see them all as SFS (they belong to my customer and I don't know what he has on them) except the one I manage, which shows up as 'Linux'. What can I do? Thank you in advance, regards, jmiguel

    Read the article

  • recommendations for disk -> usb backup software

    - by TWood
    Recently I lost a tape drive, and rather than repair the unit I decided that backups to external USB drives would be cheaper. In the past I used NTBackup and figured that the new Server 2008 R2 backup utility (wbadmin) would meet my needs. It does not. I am looking for recommendations for another utility I can use. My requirements are: backup of the local disk in addition to files on a network share; scheduled-task integration (or some GUI options to manage the schedule); and non-incremental backups. Basically I could do all of this with wbadmin if it just supported network shares as a source. I saw some links describing attaching a VHD pointed at a network share, but I am trying to avoid hacks like that; if I'm going to go to all that trouble I may as well copy the directories over myself. If anyone has any software suggestions that might make this task easier for me, please let me know. I am considering BackupAssist but can only find a few reviews here and there for it.

    Read the article

  • Reading a file from an alternate location

    - by Highstaker
    I have a certain file (data.abc) located in, say, my home folder. I make a copy of it in another location (for example, "/mnt/ramtemp/"). Whenever the file in my home folder is accessed by any process, I want it to be read not from the home folder but from "/mnt/ramtemp/". As you might have guessed from the path, that is where I mount the ramfs. So, basically, I want processes to access not the file on my HDD (which is slower) but its copy on the ramfs (which is much faster). At the same time, I want data.abc to remain in my home folder under that name; I don't want to rename or delete it. Is there any way I could get the system to redirect processes to the alternative location whenever they try to read the file from my home folder?
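
    A minimal sketch of one common way to do this with a bind mount; the paths are from the question, the username is an assumption:

        # copy the file into the ramfs, then shadow the original with a bind mount;
        # reads of ~/data.abc now hit the in-memory copy, and the file on disk is untouched
        cp /home/user/data.abc /mnt/ramtemp/data.abc
        sudo mount --bind /mnt/ramtemp/data.abc /home/user/data.abc

        # undo it later with:
        #   sudo umount /home/user/data.abc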

    Read the article

  • How to keep programs from source up to date?

    - by wizard
    I'm designing a new server setup for hosting multiple websites (shared hosting for my clients over at SliceHost). I've recently moved away from the traditional LAMP setup and chosen Ubuntu, Nginx, php-fpm and MySQL. I like it a lot better than my old Apache, suPHP, MySQL setup: it works great, provides encapsulation between sites and uses substantially less memory. However, I have one major maintenance problem. In order to have a recent version of Nginx, and in order to use php-fpm at all, I've had to compile these programs from source. The problem as I see it is that keeping track of updates and build configurations will end up being a lot of work. For two programs (and a patch) I can handle it, but this setup would not scale to many packages and servers. Are there good ways to manage this situation? I'm sure people do this all the time.
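
    One hedged sketch of a middle ground: keep building from source, but let the package manager track the result, for example with checkinstall (the version number and configure flags here are placeholders):

        # inside an unpacked nginx source tree
        ./configure --prefix=/opt/nginx --with-http_ssl_module
        make
        sudo checkinstall --pkgname=nginx-custom --pkgversion=1.0.15 --default
        # produces and installs a .deb, so dpkg -l / dpkg -r work as usual,
        # and the same package can be copied to the other servers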

    Read the article

  • APC on PHP 5.4 does not seem to be installed after installation

    - by Burning the Codeigniter
    I've recently upgraded from PHP 5.3.6 to 5.4: I ran apt-get upgrade php5 with the custom PHP 5.4 repo added to my apt sources, then restarted php-fastcgi and php5-fpm. APC does not seem to be installed with the new version. When I run pecl install apc, it configures and installs with the details below: Configuring for: PHP Api Version: 20090626 Zend Module Api No: 20090626 Zend Extension Api No: 220090626 But in my phpinfo() I get this: PHP API 20100412 PHP Extension 20100525 Zend Extension 220100525 Which I don't understand. The apc.so it installs goes to /usr/lib/php5/20090626/, but in /usr/lib/php5/ I have two API directories: 20090626 and 20100525. How can I remove the old one, keep PHP 5.4, and get PECL to install APC into the correct PHP version? I'm running Ubuntu 11.04 on my server. I need help on this, please.
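
    A hedged sketch of building the extension by hand against the intended PHP; it assumes the PHP 5.4 phpize and php-config are the ones on the PATH (adjust the paths if the 5.3 binaries still shadow them):

        # check which API the build tools will target first;
        # for PHP 5.4 phpize should report Zend Module Api No: 20100525
        phpize --version

        pecl download apc
        tar xzf APC-*.tgz && cd APC-*/
        phpize
        ./configure --with-php-config=/usr/bin/php-config
        make && sudo make install     # should land in /usr/lib/php5/20100525/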

    Read the article

  • Assembling a number-crunching computer [closed]

    - by tugrul büyükisik
    What is needed to keep a GPU fully fed by the CPU? Is comparing their FLOPS enough? For example, if I paired a very old CPU (a Pentium III) with an Nvidia Fermi GPU, it would not be able to feed the GPU with enough data per second. What is the criterion for matching a CPU to a GPU when OpenCL or similar work is needed? Of course RAM and the bus will be chosen in a similar way, but how exactly? Assume each GPU core will calculate a square root, a division and an addition 100 times per iteration. Thanks.

    Read the article

  • Live Mesh beta on Windows 7 - Did anybody get it working?

    - by Andrew J. Brehm
    I have the Live Mesh beta installed on 64-bit Windows 7 but cannot connect to the computer remotely. The "Manage devices" page gives this information: This device is synchronizing but cannot be remotely accessed. Check the Live Mesh notifier on the device to see if Live Mesh remote desktop enhancements need to be installed. However, I cannot find instructions on how to use the "notifier" to see whether the Live Mesh remote desktop enhancements are installed, or how I could install them. Both times I installed Mesh, I did tell it to install the "remote desktop enhancements". Does this just not work with Windows 7 and/or 64-bit?

    Read the article

  • How to bridge two networks via VPN (IPsec)?

    - by polemon
    I'd like to set up site-to-site bridging with VPN (IPsec); how do I do that? On the local side I have a DrayTek Vigor 2910, which is supposed to be able to manage IPsec tunnels. I need several VPN tunnels to various sites, but how exactly do I do that if the only router I can configure is the local one? As I understand it, I'd need some sort of VPN server or client, or whatever, on the other side. In any event, please clarify that issue. Thanks.
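
    As an aside, if the far side ends up being a Linux box, one common choice is Openswan/strongSwan. A minimal site-to-site ipsec.conf sketch, with every address and subnet here a placeholder rather than anything from the question:

        conn office-to-site1
            left=%defaultroute          # this Linux gateway
            leftsubnet=192.168.2.0/24
            right=203.0.113.10          # WAN address of the Vigor 2910
            rightsubnet=192.168.1.0/24
            authby=secret               # pre-shared key kept in ipsec.secrets
            auto=start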

    Read the article

  • Cannot access to my app's dashboard (iTunes Connect)

    - by Skynext
    For more than a week I haven't been able to access my app's admin page on iTunes Connect. I am able to log in to iTunes Connect and access the "Manage my Apps" section, but when I select a specific app I get the following message: Unable to Process Request Your request could not be processed. For additional help, send an email to [email protected] I have deleted cookies and browsing data in Safari and tried with Chrome and Firefox: nothing. I contacted iTunes Connect support more than a week ago but nothing has moved. If anyone has experienced the same situation and can help me... Thank you!

    Read the article

  • What sort of things can cause a whole system to appear to hang for 100s-1000s of milliseconds?

    - by Ogapo
    I am working on a Windows game, and while rendering, some computers experience intermittent pauses ("hitches" for lack of a better term). When profiled, they appear in seemingly random places in the code. Eventually I noticed that it wasn't just my process that was affected, but (seemingly) every process on the system; all of the threads in my application hitch at once. CPU utilization drops during these hitches and it appears as if most processes make no progress. This leads me to believe it may be an operating system or driver issue, but it only occurs while playing the game (and only on some systems). What sort of operations might the operating system be doing that would require the kernel to pause all user threads and block? Some kind of I/O? At first I thought of paging, but my impression is that would only affect a single process, no? Systems in use: Windows, DirectX (3D), nVidia cards (unknown whether it replicates on ATI), using overlapped I/O for streaming.

    Read the article

  • What causes Remote Desktop Services Manager to crash in Server 2008 R2?

    - by milkmood
    I have a consistent problem with RDSM crashing on Server 2008 R2. It is either really slow to open, sometimes never opens, or, after it has been open and working properly for a bit, stops working and forces an unload of the snap-in. It has done this since the deployment of this server: new hardware, new instance of S2k8, logged in as the domain administrator. I am using it to manage three terminal servers; the other two are S2k3. I've used it without issues on other 2008 servers.

    Read the article

  • How to run a command in a process that is not a child of the current process?

    - by amicitas
    I am having a library conflict when calling an external program from within an interpreted programming environment (IDL). The issue seems to be that, since the program I am calling ends up as a child of IDL, libraries are not being reloaded. From within IDL I can launch sub-processes either directly or through a shell. Is there a good way to run my program without it ending up as a child process? The only solution I have found so far is ssh localhost my_program. This works perfectly, but I would like a more direct solution.
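
    A hedged sketch of one workaround, assuming the conflict comes from environment variables the child inherits (such as LD_LIBRARY_PATH set by IDL) rather than from the parent/child relationship itself; my_program stands in for the real binary:

        # spawn with a scrubbed environment and in its own session:
        # env -i drops the inherited variables, setsid detaches it from IDL's session
        env -i HOME="$HOME" PATH=/usr/bin:/bin setsid my_program >/tmp/my_program.log 2>&1 &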

    Read the article

  • An international mobile app - Should I set up EC2 instances in multiple regions?

    - by ashiina
    I am currently trying to launch a mobile app for users around the world. It is not a spectacular launch which will get millions of users in weeks; it's just another individual developer releasing an app. I know enough about the techniques for managing time zones, internationalizing strings, and so on (the application layer), but I cannot find any information on how I should manage my EC2 instances. Should I be setting up EC2 instances in different regions around the world? Is that a must, or is it overkill? I'm aware that it's the ideal solution in terms of performance, but it becomes very tough to manage servers in multiple regions: DB issues, AMI management, etc. I'd much rather not do so. So I would like to know the general best practice when launching an international app/website. Note: for static content, I know it's better to use a CDN, so I'm planning on doing that.

    Read the article

  • What's the proper way to change a process' scheduling policy to IDLE?

    - by ??O?????
    Hello. I have a long-running process on a server running Ubuntu Server 9.10. I would like to make it run under the SCHED_IDLE policy using the chrt command. However, after reading the man page, I can't work out the proper way to issue the command for a running process. I've tried, unsuccessfully:
        # chrt -i -p 688
        pid 688's current scheduling policy: SCHED_OTHER
        pid 688's current scheduling priority: 0
        # chrt -p -i 688
        pid 688's current scheduling policy: SCHED_OTHER
        pid 688's current scheduling priority: 0
        # chrt -p 688 -i
        chrt: failed to set pid 0's policy: Invalid argument
    I'll keep trying, but do you know how to do what I want?
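
    (For reference, with the util-linux chrt the priority usually has to be given explicitly even for SCHED_IDLE, where 0 is the only valid value; a hedged sketch of the invocation that should work:)

        # set pid 688 to SCHED_IDLE with priority 0
        sudo chrt -i -p 0 688
        chrt -p 688          # verify: should now report SCHED_IDLE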

    Read the article

  • How To Check My Current Version of FFMPEG

    - by aamiri
    I have FFmpeg installed on two different servers. On one of them, I run into an issue every time I try to convert m4v files: ffmpeg just processes the file indefinitely. When I take the same source file and run it on the other server, it works just fine. Both servers are running the same version of GNU/Linux. Someone suggested I check whether the same version of ffmpeg is installed on both servers, so my question to you all is: how do I check my ffmpeg version? Thanks!
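
    A quick sketch of the usual check; run it on both servers and compare the output:

        ffmpeg -version    # prints the version plus the configure flags it was built with
        # on very old builds the same information appears in the banner printed by plain `ffmpeg`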

    Read the article

  • Problems getting auditd set up on my server

    - by Tola Odejayi
    I'm trying to figure out which processes are deleting files from a specific directory, so I want to set up and run auditd on my system. I've set up the following rule in audit.rules: -w S unlink -S truncate -S ftruncate -a exit,always -k cache_deletion -w /home/myfolder/cache Then I type this to start the audit daemon: auditctl -R /etc/audit/audit.rules -e 1 But I get this error message: Error - nested rule files not supported Does anyone know what I am doing wrong here, and how I can resolve it? Also, what do I have to do to get the daemon running at startup?
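
    Not a full answer, but for context, a hedged sketch of a simpler watch-style rule and the usual way to start auditd at boot; the key name comes from the question, and the init commands assume a sysvinit-style distro:

        # watch the directory for writes and attribute changes, tagged for easy searching
        auditctl -w /home/myfolder/cache -p wa -k cache_deletion

        # later, see which processes touched it
        ausearch -k cache_deletion

        # start the daemon now and at every boot
        service auditd start
        chkconfig auditd on          # Red Hat style; on Debian/Ubuntu: update-rc.d auditd defaults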

    Read the article

  • Disabling certain JBoss ports

    - by Rich
    We are trying to configure JBoss 5.1.0 to be as lightweight and as secure as possible. Part of this process is identifying and closing any ports we do not need. Three ports that are still open but that we don't believe we need are: 4457 (bisocket), 4712 (JBossTS Recovery Manager) and 4713 (JBossTS Transaction Status Manager). We don't think we need any of these features (but could be wrong). Bisocket seems to be a way for JMS clients behind a firewall to communicate with JBoss; we hardly use JMS now, and when we do, it is very unlikely that we will need this firewall-traversing ability. I am less sure whether we need the two JBossTS ports; I am guessing these are used in a clustered environment, and we aren't clustered. So my question is: how do we disable these ports (and associated processes where possible), or, if we do need these ports, why do we need to keep them open?

    Read the article

  • Dependency diagramming / mapping tool [closed]

    - by Lars
    I am looking for a tool that allows me to easily create and maintain dependency maps of our mission-critical servers, apps, processes, etc. It needs to be intuitive and easy to work with, and able to generate diagrams that clearly show the dependencies graphically. What would be some good tools for this? I have looked at videos for AssetGen Sysmap and BluePrint from Pathwaysystems.com, and they both seem to fit my needs, but there must be more good systems like them that I could look at. I want to make sure I pick the best system for our needs (and limited budget).

    Read the article

  • How do I remove Lenovo Veriface from the login screen?

    - by Xenorose
    I have a Lenovo laptop which came preloaded with Windows 7. Every time I start the computer and get to the Windows log-in screen (where you enter the user password), I get a message about the VeriFace software giving me the option to use it. I'd like to disable this. I went through the program's settings and there is nothing that allows you to stop it loading with Windows. I also thought it might be a service I could disable, but I don't see it in the list of services, nor is it in the list of start-up processes (either in msconfig or in the registry). I'm considering uninstalling it completely, but since it's part of the Lenovo software pack that came with the computer, and I do use some of that software, I'm not sure whether uninstalling it might also remove things I want (and uninstalling and reinstalling as needed seems like a mess). Does anybody know if there's an easy way to achieve this?

    Read the article

  • Graphing per-user CPU usage on a Linux machine

    - by mart1n
    I want to graph (graphical output would be great, i.e. a .png file) the following situation: I have users A, B, and C. I limit their resources so that when all three run a CPU-intensive task at the same time, their processes get 25%, 25%, and 50% of the CPU. I know I can get real-time stats using top, but I have no idea what to do with them. I've searched through the huge top man page but haven't found much on outputting data that can be graphed. Ideally, the graph would show a span of maybe 30 seconds. Any ideas how to achieve this?
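
    One hedged sketch of a do-it-yourself approach: sample with ps rather than top, then feed the result to gnuplot. The one-second interval and file names are arbitrary, and the usernames A/B are the placeholders from the question:

        #!/bin/sh
        # take 30 one-second samples of per-user CPU usage
        for t in $(seq 1 30); do
            ps -eo user,pcpu --no-headers |
                awk -v t="$t" '{cpu[$1]+=$2} END {for (u in cpu) print t, u, cpu[u]}'
            sleep 1
        done > cpu_samples.dat

        # then plot one line per user, e.g. in gnuplot:
        #   set terminal png; set output "cpu.png"
        #   plot "< grep ' A ' cpu_samples.dat" using 1:3 with lines title "A", \
        #        "< grep ' B ' cpu_samples.dat" using 1:3 with lines title "B"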

    Read the article

  • Cloned Windows 7 to new HDD and want to change the drive letter to C

    - by Hoppe
    I used Clonezilla to clone my existing hard drive to a new one I bought. I then changed the BIOS to set the new drive first in the boot sequence. I'm pretty sure that I'm still running Windows 7 from the old drive, and my old drive is still marked as C:. Now that I don't have a disk drive any more, how do I swap the drive letter from J: to C:? I tried to change it in the Disk Management section of "Manage", but it reports: "the parameter is incorrect".

    Read the article

  • what do you use for storage discovery / storage management?

    - by lysdexic
    I am looking for ideas on the best way to discover and manage network storage. Discovery: are there any good tools that will scan hosts and storage devices and report back their findings, perhaps using SNMP or WMI? Management: I'm currently looking at Storage Manager by SolarWinds, but it is a bit pricey. Are there any good open-source projects like this? I'm just looking for ideas on how to get better visibility into the storage infrastructure on this network. It includes an HP LeftHand iSCSI SAN and an EMC VNX SAN, as well as individual hosts with local storage. Thanks for reading and thanks for any input.

    Read the article

  • Performance Drop Lingers after Load [closed]

    - by Charles
    Possible Duplicate: How do you do Load Testing and Capacity Planning for Databases I'm noticing a drop in performance after successive load tests. Although our CPU and RAM numbers look fine, performance seems to degrade over time as sustained load is applied to the system. If we allow more time between the load tests, performance recovers to about 1,000 ms, but if we apply load every 3 minutes or so, it degrades to the point where requests take 12,000 ms. None of the application servers show lingering Apache processes, and the number of database connections settles back down to about 3 (from a sustained 20). Is there anything else I should be looking out for here?

    Read the article
