Search Results

Search found 40479 results on 1620 pages for 'binary files'.


  • How to spawn-fcgi multiple FCGI processes?

    - by Shrinath
    We have nginx installed and would like to spawn-fcgi multiple ".fcgi" files. The programs were written in C. How do we spawn all the files in one go? Edit: This is the scenario: I have 3 different programs to serve. Let's say I have search results from Google, Yahoo, and Bing. I want to show 3 columns that host the results of the above providers. I have 3 fcgi scripts, one for each provider. How do you suggest I put all 3 into action?
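    A minimal sketch of how the three binaries might each be spawned on their own port (the paths, ports, and user are assumptions for illustration; adjust them to your setup):

        # spawn each provider's FastCGI binary on its own TCP port, 4 children each
        spawn-fcgi -a 127.0.0.1 -p 9001 -F 4 -u www-data -g www-data -P /var/run/google.pid -f /var/www/fcgi/google.fcgi
        spawn-fcgi -a 127.0.0.1 -p 9002 -F 4 -u www-data -g www-data -P /var/run/yahoo.pid  -f /var/www/fcgi/yahoo.fcgi
        spawn-fcgi -a 127.0.0.1 -p 9003 -F 4 -u www-data -g www-data -P /var/run/bing.pid   -f /var/www/fcgi/bing.fcgi

    In nginx, each column's location block would then point at its own port with fastcgi_pass 127.0.0.1:9001 and so on; an init script or a small wrapper shell script can run all three spawn-fcgi lines at boot.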

    Read the article

  • Windows 7: The boot selection failed because a required device is inaccessible (0xc000000f)

    - by piratejackus
    I have a problem with my Windows 7. Hardware: Acer 3820TG. Operating systems: Windows 7 and Ubuntu 10.04, dual boot. Case: when I try to boot my Windows 7 I see an error: "Windows failed to start. A recent hardware or software change might be the cause. To fix the problem: 1. Insert.... 2. .... ... status: 0xc000000f info: The boot selection failed because a required device is inaccessible...." I can't exactly remember what my last actions on Windows were. I already searched this error and applied the proposed solutions after creating a repair USB (because I don't have a CD-ROM drive or a Windows 7 CD): repairing the operating system (it says it cannot repair it); checking the disk (chkdsk D: /f /r), which checks the disk without any problem or error and takes pretty long (more than an hour), but when I restart I still get the same error; restoring from a restore point (I didn't create one, so I skip this option); restoring a system image (I don't have one). I tried to run Windows recovery (I have a recovery partition), but there are just two options: 1) Format the operating system but retain user data (this copies the files under Users to a C:\backup folder, but when I searched deeper I found people who had already tried this option and couldn't find their user files under the backup directory; plus I unfortunately have just one partition, D (a fault, I know), because I always use Ubuntu, so this is not applicable in my situation); 2) Format the entire system (Windows). I keep my valuable data on Windows but not in the user folder; I was accessing it from Windows. I tried to repair the Windows boot with bootrec /fixMBR, bootrec /fixBoot, bootrec /rebuildBCD; I lost the whole GRUB menu and reinstalled it (ubuntuforums.org/showthread.php?t=1014708&page=29), but nothing changed, same error. I created a thread in the Microsoft forums (http://social.answers.microsoft.com/Forums/en-US/w7install/thread/69517faf-850a-45fd-8195-6d4ed831f805) but couldn't find a solution. Before I ran chkdsk from the USB repair disk I couldn't mount the Windows (NTFS) partition from Ubuntu; I was getting "couldn't mount file system, error code 2". I tried to fix the NTFS partition from Ubuntu and got a "segmentation fault". I also created a thread on ubuntuforums for this mount problem: http://ubuntuforums.org/showthread.php?t=1606427. After chkdsk I was able to mount the Windows partition, but all I see on it is chkdsk logs, no other data. I don't think I lost my data, because I don't get any filesystem errors, just the boot section, but these log files on the Windows partition worry me. I see that Microsoft developers don't have a solution for this error yet. If you need any more information I can provide it; maybe I'm missing some points or it could be complicated. Thanks in advance.
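    For what it's worth, one further sequence sometimes suggested from the repair USB's command prompt, after the bootrec steps above, is to rescan for installations and rewrite the boot files; this is only a sketch (it assumes Windows is on what the recovery environment sees as C:), and it hands the MBR back to the Windows loader, so GRUB would need to be reinstalled again afterwards, as you already did once:

        REM from the System Recovery Options command prompt
        bootrec /ScanOs
        bootrec /RebuildBcd
        bcdboot C:\Windows /s C: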

    Read the article

  • Recovering OST file without profile

    - by Philippe
    We have a Microsoft Exchange Server 2007 and offer this solution to many customers. Recently, a customer had his personal Exchange server crash (which is what made him our customer). He called a technician to see if he could repair his server before calling us, but said tech wasn't able to do anything for them. Now that all his mailboxes are on our server, he would like to transfer his old emails over to the new profile, but the tech deleted all the profiles on the client machines while trying to repair the Exchange server. So my customer still has the OST files, but they are not related to any profiles. Is there any way to re-attach them to a profile, or to convert them into PST files that he could then import into his new profile? The only thing I found was third-party software that could do the conversion, but I am wondering if Microsoft has any tools that could re-attach the OST to a new profile. I have also tried scanpst.exe and scanost.exe, to no avail. Thank you

    Read the article

  • Best way to use my Windows box as a backup?

    - by user29336
    I put a 1.5 TB HD in my Windows 7 box, and my main computer is an MBP. I have a lot of professional files/folders on a FireWire 800 external HD connected to the MBP, and I want to use the 1.5 TB HD in my Windows 7 box as a backup for both the external HD and the MBP. Right now I am just copying files manually to the HD over the network, and that's very slow and open to failure (not rsync'd). Can anyone suggest some appropriate solutions? Should I just figure out how to set up rsync on the Windows box, or is there a better alternative? Thanks!
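    One low-friction option, since the source machine is a Mac, is to mount a share from the Windows 7 box over SMB and let rsync do incremental copies to it; a minimal sketch, assuming a share named backup on the Windows machine and the hypothetical local paths below (some Mac metadata won't survive the SMB hop, so treat this as a file-level backup):

        # mount the Windows share (Finder's "Connect to Server" does the same thing)
        mkdir -p /Volumes/backup
        mount -t smbfs //user@windowsbox/backup /Volumes/backup
        # mirror the FireWire drive and the home folder; --delete keeps the copies exact
        rsync -avh --delete /Volumes/FW800Drive/ /Volumes/backup/fw800/
        rsync -avh --delete ~/ /Volumes/backup/mbp-home/

    Wrapping the rsync lines in a script run from cron or launchd would make it hands-off.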

    Read the article

  • How to add assemblies on a 64-bit machine?

    - by marko
    My old cmd script: C:\Windows\Microsoft.NET\Framework\v2.0.50727\RegAsm blabla.dll C:\Windows\Microsoft.NET\Framework\v2.0.50727\GacUtil -i blabla.dll (which works fine on my old machine). But now I have a script for a 64-bit machine (Windows Server 2008 R2): C:\Windows\Microsoft.NET\Framework64\v2.0.50727\RegAsm blabla.dll C:\Program Files\Microsoft SDKs\Windows\v7.1\Bin\NETFX 4.0 Tools\GacUtil -i blabla.dll Then I get this message: C:\Windows\Microsoft.NET\Framework64\v2.0.50727\RegAsm blabla.dll Microsoft (R) .NET Framework Assembly Registration Utility 2.0.50727.5420 Copyright (C) Microsoft Corporation 1998-2004. All rights reserved. Types registered successfully C:\Program Files\Microsoft SDKs\Windows\v7.1\Bin\NETFX 4.0 Tools\GacUtil -i blabla.dll 'C:\Program' is not recognized as an internal or external command, operable program or batch file. The second command is not successful.
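    The failure is just cmd.exe splitting the unquoted path at the space in "Program Files"; quoting the GacUtil path should fix the second command. A sketch of the corrected script:

        C:\Windows\Microsoft.NET\Framework64\v2.0.50727\RegAsm blabla.dll
        REM quote the path so the space in "Program Files" is not treated as a separator
        "C:\Program Files\Microsoft SDKs\Windows\v7.1\Bin\NETFX 4.0 Tools\GacUtil" -i blabla.dll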

    Read the article

  • Write-through RAM disk, or massive caching of the file system?

    - by Will
    I have a program that is very heavily hitting the file system, reading and writing randomly to a set of working files. The files total several gigabytes in size, but I can spare the RAM to keep them all mostly in memory. The machines this program runs on are typically Ubuntu Linux boxes. Is there a way to configure the file system to have a very, very large cache, and even to cache writes so they hit the disk later? I understand the issues with power loss and such, and am prepared to accept that. Crashing aside, in normal operation the writes should eventually reach the disk! Or is there a way to create a RAM disk that writes through to a real disk?
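    On Linux the page cache already keeps recently used file data in RAM; what you can tune is how long dirty pages are allowed to linger before writeback. A sketch of the knobs involved (the numbers are illustrative assumptions, not recommendations), plus vmtouch to pre-load and pin the working set if it is installed; the path is a placeholder:

        # let dirty pages build up and sit for ~10 minutes before being flushed
        sysctl -w vm.dirty_ratio=80
        sysctl -w vm.dirty_background_ratio=50
        sysctl -w vm.dirty_expire_centisecs=60000
        sysctl -w vm.dirty_writeback_centisecs=60000
        # optionally pre-fault and lock the working files into the page cache
        vmtouch -t -l /path/to/working/files

    The explicit write-behind RAM disk variant is usually a tmpfs mount holding the working files plus a periodic rsync back to the real disk, accepting that a crash loses whatever hasn't been synced.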

    Read the article

  • How to manage website and user permissions with Apache on Mac OS X

    - by Sander Versluys
    I have lots of different websites in a Development directory in my home dir. While developing, files get saved under my username, but websites configured under Apache need their permissions set to the _www user and group. What's the best way to handle this? Do I run Apache under a different user/group? Do I run my development tools under a different user? Do I add myself to the _www group? (That doesn't seem to work, by the way.) I've just switched to a Mac and I'm trying to find a smooth development workflow, so it would be best if I could just run the necessary tools, save some files, and be able to test the website without much hassle. Thanks!
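    One common arrangement on a single-user development Mac is to keep the files owned by you, share them with Apache's group, and let the group permissions carry the access; a sketch, assuming the sites live under ~/Development:

        # add your account to Apache's group (log out and back in for it to apply)
        sudo dscl . -append /Groups/_www GroupMembership "$(whoami)"
        # or the other direction: give the tree to the _www group with group write
        sudo chgrp -R _www ~/Development
        chmod -R g+w ~/Development
        # setgid on directories so new files keep the group
        find ~/Development -type d -exec chmod g+s {} \;

    On a machine only you use, another option is simply to point the User/Group directives in httpd.conf at your own account, so files saved by your tools are readable by Apache without any permission juggling.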

    Read the article

  • How can I create a custom OpenOffice / LibreOffice Writer table AutoFormat scheme?

    - by Merlyn Morgan-Graham
    None of the basic table AutoFormat schemes in LibreOffice Writer have both an alternation style defined and no sum column/row style defined. If they have alternation, they always seem to have sums. Because of this I'd like to define my own table scheme. What is the easiest way to accomplish this? A WYSIWYG isn't totally necessary. I am not scared of editing simple XML files as long as I have examples to work from, and if I don't have to edit base install files. If I can place them in a custom area or my user profile directory then that would be best. If there is a way to get the GUI Add functionality to properly recognize an alternation then that would also be helpful.

    Read the article

  • Is there a quick way to create a Windows shortcut to a file without it validating the path?

    - by Alistair McMillan
    I'm trying to create a shortcut for someone else. It needs to point to files on one of their mapped drives. Instead of waiting for them to be available to create the shortcut on their PC, I was hoping to just create the shortcut on my PC and then transfer it over to theirs. However, Windows tries to validate the path as you create the shortcut, and since I don't have access to the same files it throws up an error and won't create the shortcut. Is there a way to create a shortcut without the path being validated?
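    The Explorer shortcut wizard is what insists on validating the target; the WScript.Shell COM object does not, so a shortcut to a path you can't currently reach can be created from PowerShell (or equivalent VBScript). A sketch with hypothetical paths:

        # create example.lnk pointing at a target that need not exist on this machine
        $shell = New-Object -ComObject WScript.Shell
        $lnk = $shell.CreateShortcut("$env:USERPROFILE\Desktop\example.lnk")
        $lnk.TargetPath = "X:\shared\reports\report.xlsx"
        $lnk.Save()

    The resulting .lnk can then be copied to the other PC, where the mapped X: drive will resolve normally.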

    Read the article

  • Trouble getting FTP login to work in IIS6

    - by Frank Rosario
    Hello all, I'm trying to set up an FTP site for one of my clients to pick up files from us using IIS6. I've created the FTP site and set it to not isolate users (not necessary, as FTP will be read-only with authentication). Here's the problem. The FTP is to be password protected, so I turned off anonymous access on the FTP site. I then created an ftpuser account on the machine and gave it read and browse-directory permissions on the FTP's root directory. However, when I go to test the ftpuser login, I get a 530 "ftpuser cannot login" error. However, if I browse to the same directory over HTTP (anonymous access turned off as well) and enter the ftpuser login info, I can download files and browse directories successfully. Why is the ftpuser working over HTTP but not FTP? Shouldn't I be able to log in over FTP with the ftpuser login information I just created? Thanks in advance, - Frank

    Read the article

  • httpd.conf for case-insensitive file serving

    - by Anton Gogolev
    I'm a complete newbie with regard to managing Apache, so excuse me if I'm phrasing something incorrectly. I have a web site -- say, http://domain.com. The problem is that when I try to open http://domain.com/index.html in a web browser it displays the page, but when I attempt to access http://domain.com/Index.html (note the capital I), it responds with HTTP 404. How do I configure Apache to serve both these files (and directories, for that matter) in a case-insensitive manner? Current httpd.conf is here. EDIT: Dan C, thanks for the hint. I basically want to allow users to download files from my server and don't really want them to be aware that Index.html and index.html are in fact different. I'd also very much like to know what the ramifications of this decision are.
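    Apache's usual answer here is mod_speling, which corrects case mismatches by redirecting the client to the real URL; a sketch of the httpd.conf lines involved, assuming the module ships with your Apache build and that /var/www/html stands in for your DocumentRoot:

        LoadModule speling_module modules/mod_speling.so
        <Directory "/var/www/html">
            # answer Index.html requests by redirecting to index.html
            CheckSpelling On
        </Directory>

    Newer Apache releases also offer a CheckCaseOnly directive to limit the corrections to case differences. As for ramifications: every file becomes reachable under many spellings (which caches and search engines may treat as separate URLs), the fix is a redirect rather than transparent serving, and genuinely ambiguous matches produce a "multiple choices" page.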

    Read the article

  • Hylafax and "No response to MPS"

    - by Joril
    We have a HylaFAX 5.2.5 CentOS 5 installation hosted inside a Xen virtual machine. It works quite well, but now I'm in the process of upgrading/migrating it to a KVM virtual machine running Ubuntu 10.04 and HylaFAX 5.5.1 (compiled from source using http://sourceforge.net/projects/hylafax/files/hylafax%20debian%20build%20files/ ). The problem I'm having is that - while receiving works fine - sending faxes is extremely unreliable; I get lots of "No response to MPS repeated 3 tries" or "Failure to transmit clean ECM image data." The line, modem and configuration files I'm using are the same as before, so I thought that it could be a KVM scheduling issue, but even setting cpu_shares to 10240 instead of 1024 doesn't change a thing... What else could I try? Here's an example log file: http://pastebin.com/cN01cpEs

    Read the article

  • MySQL Memory usage

    - by Rob Stevenson-Leggett
    Our MySQL server seems to be using a lot of memory. I've tried looking for slow queries and queries with no index and have halved the peak CPU usage and Apache memory usage, but the MySQL memory stays constantly at 2.2GB (~51% of available memory on the server). Here's the graph from Plesk. Running top in the SSH window shows the same figures. Does anyone have any ideas on why the memory usage is constant like this and not peaks and troughs with usage of the app? Here's the output of the MySQL Tuning Primer script:
    -- MYSQL PERFORMANCE TUNING PRIMER --
    - By: Matthew Montgomery -
    MySQL Version 5.0.77-log x86_64
    Uptime = 1 days 14 hrs 4 min 21 sec
    Avg. qps = 22
    Total Questions = 3059456
    Threads Connected = 13
    Warning: Server has not been running for at least 48hrs. It may not be safe to use these recommendations
    To find out more information on how each of these runtime variables effects performance visit: http://dev.mysql.com/doc/refman/5.0/en/server-system-variables.html
    Visit http://www.mysql.com/products/enterprise/advisors.html for info about MySQL's Enterprise Monitoring and Advisory Service
    SLOW QUERIES: The slow query log is enabled. Current long_query_time = 1 sec. You have 6 out of 3059477 that take longer than 1 sec. to complete. Your long_query_time seems to be fine.
    BINARY UPDATE LOG: The binary update log is NOT enabled. You will not be able to do point in time recovery. See http://dev.mysql.com/doc/refman/5.0/en/point-in-time-recovery.html
    WORKER THREADS: Current thread_cache_size = 0. Current threads_cached = 0. Current threads_per_sec = 2. Historic threads_per_sec = 0. Threads created per/sec are overrunning threads cached. You should raise thread_cache_size.
    MAX CONNECTIONS: Current max_connections = 100. Current threads_connected = 14. Historic max_used_connections = 20. The number of used connections is 20% of the configured maximum. Your max_connections variable seems to be fine.
    INNODB STATUS: Current InnoDB index space = 6 M. Current InnoDB data space = 18 M. Current InnoDB buffer pool free = 0 %. Current innodb_buffer_pool_size = 8 M. Depending on how much space your innodb indexes take up it may be safe to increase this value to up to 2 / 3 of total system memory.
    MEMORY USAGE: Max Memory Ever Allocated : 2.07 G. Configured Max Per-thread Buffers : 274 M. Configured Max Global Buffers : 2.01 G. Configured Max Memory Limit : 2.28 G. Physical Memory : 3.84 G. Max memory limit seem to be within acceptable norms.
    KEY BUFFER: Current MyISAM index space = 4 M. Current key_buffer_size = 7 M. Key cache miss rate is 1 : 40. Key buffer free ratio = 81 %. Your key_buffer_size seems to be fine.
    QUERY CACHE: Query cache is supported but not enabled. Perhaps you should set the query_cache_size.
    SORT OPERATIONS: Current sort_buffer_size = 2 M. Current read_rnd_buffer_size = 256 K. Sort buffer seems to be fine.
    JOINS: Current join_buffer_size = 132.00 K. You have had 16 queries where a join could not use an index properly. You should enable "log-queries-not-using-indexes", then look for non indexed joins in the slow query log. If you are unable to optimize your queries you may want to increase your join_buffer_size to accommodate larger joins in one pass. Note! This script will still suggest raising the join_buffer_size when ANY joins not using indexes are found.
    OPEN FILES LIMIT: Current open_files_limit = 1024 files. The open_files_limit should typically be set to at least 2x-3x that of table_cache if you have heavy MyISAM usage. Your open_files_limit value seems to be fine.
    TABLE CACHE: Current table_cache value = 64 tables. You have a total of 426 tables. You have 64 open tables. Current table_cache hit rate is 1%, while 100% of your table cache is in use. You should probably increase your table_cache.
    TEMP TABLES: Current max_heap_table_size = 16 M. Current tmp_table_size = 32 M. Of 15134 temp tables, 9% were created on disk. Effective in-memory tmp_table_size is limited to max_heap_table_size. Created disk tmp tables ratio seems fine.
    TABLE SCANS: Current read_buffer_size = 128 K. Current table scan ratio = 2915 : 1. read_buffer_size seems to be fine.
    TABLE LOCKING: Current Lock Wait ratio = 1 : 142213. Your table locking seems to be fine.
    The app is a Facebook game with about 50-100 concurrent users. Thanks, Rob
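    The primer's own "Configured Max Global Buffers: 2.01 G" figure is memory mysqld reserves up front and holds regardless of load, so a constant line rather than peaks and troughs is expected. Its suggestions translate into a my.cnf sketch along these lines (the values are illustrative assumptions to be tested, not tuned recommendations):

        [mysqld]
        thread_cache_size = 8             # stop creating a new thread per connection
        table_cache = 512                 # primer reports a 1% table_cache hit rate
        query_cache_size = 32M            # query cache is currently disabled
        innodb_buffer_pool_size = 64M     # InnoDB data + index is only ~24M today
        log-queries-not-using-indexes     # surface the 16 unindexed joins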

    Read the article

  • Maintenance Plan Reporting - Append To File - Clean Up?

    - by Adam J.R. Erickson
    Background: (SQL Server 2005, Standard Ed.) I have a maintenance plan running backups, taking a full backup once a day and a t-log backup every 15 minutes. I have it set to create a text file report of each run, but that creates A LOT of files on the file server. These are hard to sort through, which makes them less useful. Question: There is an option in the "Reporting and Logging" settings for appending all logs together, but how do you clean this out? If you're appending to the same log file every time, how should you make sure this file doesn't grow indefinitely? Is there a built-in function to clean out portions of appended logs like there is for cleaning out individual old log files?

    Read the article

  • X:\ is not accessible. Insufficient system resources exist to complete the requested service. Help

    - by Katherine
    I keep getting the above error message on multiple computers that I administer. I wasn't sure if I should be posting this on SuperUser or ServerFault, so my apologies if it should go there... Basically, I have at least 5 computers of varying ages (some fresh out of the box!) throwing the above error. X:\ is one of our network drives that is mapped for users. Most of the time, shutting down the biggest application will fix the problem, but it's becoming an increasing issue, and I can't keep running around fixing it manually. I have tried to do some research, but most of it just states the obvious without supplying a permanent fix. The machines are all running Win XP SP3 with at least 2 GB of RAM. Sorry for the delay in getting back to people... a lot of good questions. To respond back: It is a Windows 2003 server that houses the file share. We have about 175 users; however, I cannot state how many are actually accessing the information at a single moment. Considering that this is our largest file share, I would say probably at least 100+. The files we work with are large, but not that big considering that we do a lot of graphical and video work: ~50 MB. That being said, this error occurs simply when trying to gain access to the server itself, not actual files. When I say close a program, I mean that it can be any program. It doesn't matter which program. It varies from machine to machine and from day to day. Some days it is Firefox, some days it is Outlook, some days it is Excel. There doesn't seem to be a common bond behind which application could be causing the problem. Thank you for the articles, and the recommendation on paging files. I will have to look into that. None of our computers are set to hibernate, so I am going to rule that out.
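    One pattern that matches these symptoms (it clears when a big application is closed, and it hits the share itself rather than particular files) is paged-pool exhaustion on the XP clients, for which Microsoft documents raising the paged pool limits in the Memory Management key. This is an assumption about the cause, so treat the sketch below as something to test on one affected machine first; back up the key and reboot afterwards:

        REM run on an affected XP SP3 client while logged on as an administrator
        reg add "HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management" /v PoolUsageMaximum /t REG_DWORD /d 60 /f
        reg add "HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management" /v PagedPoolSize /t REG_DWORD /d 0xFFFFFFFF /f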

    Read the article

  • How to dump remote database without mysqldump?

    - by deceze
    I want to dump the database on my remotely hosted site in regular intervals using a shell script. Unfortunately the server is locked down pretty tight, has no mysqldump installed, binary files can't be executed by normal users/in home directories (so I can't "install" it myself) and the database lives on a separate server, so I can't grab the files directly. The only thing I can do is log into the webserver via SSH and establish a connection to the database server using the mysql command line client. How can I dump the contents to a file a la mysqldump in SQL format? Bonus: If possible, how can I dump the contents directly to my end of the SSH connection?
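    Since you can SSH to the webserver and the webserver can reach the database host, one way to keep real mysqldump output without installing anything remotely is to tunnel the MySQL port and run mysqldump on your own machine; a sketch with hypothetical hostnames and credentials:

        # forward local port 3307 to the DB server as the webserver sees it
        ssh -f -N -L 3307:dbserver.example.com:3306 you@webserver.example.com
        # run mysqldump locally through the tunnel; the dump lands on your side
        mysqldump -h 127.0.0.1 -P 3307 -u dbuser -p mydatabase > dump_$(date +%F).sql

    This also covers the bonus: the file is written directly at your end of the SSH connection, and because the TCP connection to MySQL originates on the webserver, a database server that only accepts connections from the webserver will still allow it.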

    Read the article

  • How to configure a custom error page in Plesk 9.3 for a non-existent folder?

    - by Junior Mayhé
    I'm trying to configure Plesk to show website visitors a custom error HTML page. The hosted site is an ASP.NET site. This site shows its custom errors on the error403.aspx and error404.aspx files. Now, to comply with Plesk, I've created error_docs with the required files like forbidden.html, etc... When a user tries to navigate to http://mysite.com/a_missing_page.aspx, the visitor is redirected to error404.aspx correctly. But when a user tries to navigate to a non-existent directory, http://mysite.com/a_missing_folder/, the site shows the regular IIS 404 page. Plesk has Custom error documents activated in the web hosting settings. The ASP.NET error pages defined in web.config are showing fine, but it seems Plesk won't show its custom HTML error documents. The bottom line here is setting up a custom error page for a directory. Is it possible to do this using Plesk, or do I have to change it manually in IIS?

    Read the article

  • How to Shrink/Reset File stderr1 on a SAP System?

    - by Techboy
    I have a file called stderr1 in the work directory of several of the SAP servers in my production cluster. It has grown to around 19 GB and filled the hard disk on each server. I have deleted all trace files and WP files from within transaction SM50, but that hasn't deleted it (or renamed it to .old). If I try to rename or delete it manually, it says I can't because the file is in use. Please can you tell me how I can delete or shrink the stderr1 file?
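    If the host is Unix-like, a file that a process holds open can usually still be truncated in place, which frees the space immediately while leaving the handle valid; a sketch, with the path shown only as a placeholder for your instance's work directory (on Windows the sharing lock generally prevents this approach):

        # zero out the open file without deleting or renaming it
        cat /dev/null > /usr/sap/<SID>/<INSTANCE>/work/stderr1

    Since the stderrN files collect the output of programs launched from the start profile, it is also worth finding which started program is spraying errors, or the file will simply grow back.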

    Read the article

  • SkyDrive does not synchronize with one device but synchronizes successfully with another

    - by Hobbes
    I have an Outlook ID. I also have SkyDrive installed on my personal laptop (Windows 7 x64). The folders and files synchronize successfully. Today, I installed SkyDrive on my office PC and logged in with the same ID as above, but it does not synchronize any of my folders or files, other than the default ones. When I view the logs (filename: SkyDrive.exe.reg.2012-08-09-150239.654.log), I see the following entry: 09-17-12,13:31:35.075,45a,146c,0,PAL,systeminformationhelper.cpp(661),0,0018E4F8,CRIT,The registry key to block Remote Access is not found.,System Error Code=0x2 Any idea as to what could be the problem?

    Read the article

  • Tool to maintain/keep track of filesystem content integrity?

    - by Jesse
    I'm looking for a tool to maintain the integrity of a filesystem and its contents using checksums -- effectively storing a list of checksum/filename pairs somewhere on the filesystem in a way that can be verified later if files are somehow damaged or lost. Git does what I want, but because it stores the contents of every file in its object database, the disk usage would at least double. And the fact that it does not provide a progress bar when scanning files tells me it was not designed for the multi-terabyte filesystem I have in mind. I can do this crudely by storing the output of md5deep, but is there a tool specifically designed for this purpose, using whatever smarts possible to make the process efficient?
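    md5deep's sibling tool hashdeep is built for this audit workflow: it writes a manifest of checksums and can later compare the tree against it, reporting changed, moved, and missing files; a sketch, assuming the data lives under /data and the manifest path is arbitrary:

        # build the manifest (relative paths so it survives remounting elsewhere)
        hashdeep -c sha256 -r -l /data > /var/lib/integrity/manifest.hashdeep
        # later: audit the tree against the manifest and report every mismatch
        hashdeep -r -l -a -v -k /var/lib/integrity/manifest.hashdeep /data

    Unlike git it stores only hashes, not file contents, so disk usage does not double.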

    Read the article

  • Administrator File Modification Privilege

    - by Leigh Riffel
    Windows Server 2008 apparently allows an application to somehow configure a folder so that any changes made within it require administrator-level access. I log in with an account that has administrator privileges, but is not the local Administrator account. When I do so, I find that I can't save changes to files opened within this folder. I know I can run the application as administrator, or move the file out of the folder, make the change, and move it back in, but I'm hoping there is a better way, short of disabling the protection entirely. Is there a way, perhaps, to remove it for just the files I frequently edit?
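    This usually comes down to the ACL the installer put on the folder, often combined with UAC filtering the administrative half of your token until you elevate. If elevating for every edit is the annoyance, granting your own account Modify on just the files or subfolder you work in keeps the rest of the protection intact; a sketch with a hypothetical path and account:

        :: run once from an elevated command prompt
        icacls "C:\Program Files\SomeApp\config" /grant "DOMAIN\yourname":(OI)(CI)M

    Whether that is acceptable depends on why the application locked the folder down in the first place.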

    Read the article

  • HDD fails, allows format, allows copy

    - by Bogdan
    Hello, I have a problem with a Fujitsu laptop. Some kid played with it and now the HDD is a wreck. I can't install Windows XP, Windows 7, or Linux. I checked it with Hiren's BootCD for bad sectors and ran chkdsk on it; it says I don't have any bad sectors, and SMART says the drive is active but reports a status error. I tried formatting it and that worked, and I tried copying files using a live CD and that worked too, but when I try to install an OS it says it can't format, or it can't copy files.

    Read the article

  • Add dpkg .symbols or .shlibs to a package made using checkinstall

    - by Hubert Kario
    I have created a simple package using checkinstall of the Oracle Instant Client libraries. The package installs without problems and is seen in the system. The problem is that checkinstall doesn't create the /var/lib/dpkg/info/oracle-instantclient11.2-basic.symbols or /var/lib/dpkg/info/oracle-instantclient11.2-basic.shlibs files, so when I try to build another package (with proper build scripts) that depends on oracle-instantclient11.2-basic, the build fails with: dpkg-shlibdeps: error: no dependency information found for \ /usr/lib/libclntsh.so.11.1 (used by \ debian/libopendbx1-oracle/usr/lib/opendbx/liboraclebackend.so.1.2.0). dh_shlibdeps: dpkg-shlibdeps \ -Tdebian/libopendbx1-oracle.substvars \ debian/libopendbx1-oracle/usr/lib/opendbx/liboraclebackend.so.1.2.0 \ returned exit code 2 make: *** [binary-arch] Error 9 Is there an easy way to automatically create a package with .symbols or .shlibs files?
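    One workaround that avoids rebuilding the checkinstall package: dpkg-shlibdeps also consults a debian/shlibs.local file in the source tree of the package being built, so the missing dependency mapping can be supplied there. A sketch, assuming the line below is placed in debian/shlibs.local of the libopendbx1-oracle source (format: library name, soversion, dependency):

        libclntsh 11.1 oracle-instantclient11.2-basic

    The longer-term fix is to add an equivalent shlibs control file to the Instant Client package itself (installed by dpkg as /var/lib/dpkg/info/oracle-instantclient11.2-basic.shlibs), so every dependent package picks it up automatically.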

    Read the article
