Search Results

Search found 39784 results on 1592 pages for 'ignore files'.

  • MySQL open files limit

    - by Brian
    This question is similar to set open_files_limit, but there was no good answer. I need to increase my table_open_cache, but first I need to increase the open_files_limit. I set the option in /etc/mysql/my.cnf: open-files-limit = 8192 This worked fine in my previous install (Ubuntu 8.04), but now in Ubuntu 10.04, when I start the server up, open_files_limit is reported to be 1710. That seems like a pretty random number for the limit to be clipped to. Anyway, I tried getting around it by adding a line like this in /etc/security/limits.conf: mysql hard nofile 8192 I also tried adding this to the pre-start script in mysql's upstart config (/etc/init/mysql.conf): ulimit -n 8192 Obviously neither of those things worked. So where is the hoop that has been added between Ubuntu 8.04 and 10.04 through which I must jump in order to actually increase the open files limit?
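
    A likely explanation, offered as a hedged note: on Ubuntu 10.04 MySQL is started by upstart, and upstart jobs ignore /etc/security/limits.conf entirely; a ulimit line in the pre-start script also doesn't help, because pre-start runs in its own shell and its limits never reach the daemon. The fix that usually works is a limit stanza in the job file itself:

        # /etc/init/mysql.conf -- add at the top level, outside any script block
        limit nofile 8192 8192

    Then restart with sudo service mysql restart and verify with SHOW VARIABLES LIKE 'open_files_limit'; in the mysql client.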

  • Suggest methods for testing changes to "pam.d/common-*" files

    - by Jamie
    How do I test the changes to the pam.d configuration files: Do I need to restart the PAM service to test the changes? Should I go through every service listed in the /etc/pam.d/ directory? I'm about to make changes to the pam.d/common-* files in an effort to put an Ubuntu box into an Active Directory controlled network. I'm just learning what to do, so I'm preparing the configuration in a VM, which I plan to deploy on metal in the coming week. It is a clean install of Ubuntu 10.04 Beta 2 server, so other than the SSH daemon, all other services are stock.
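
    A note on the first question: there is no PAM daemon to restart; each service reads its /etc/pam.d configuration on every authentication, so edits take effect on the next attempt. A cautious test loop, assuming the pamtester package (available in Ubuntu) and keeping one root shell open in case a mistake locks you out:

        sudo apt-get install pamtester
        # exercise the auth and account stacks exactly as sshd would
        pamtester sshd someuser authenticate
        pamtester sshd someuser acct_mgmt

    Testing against one or two representative services (sshd, login, sudo) is usually enough, since the common-* files are included by all of them.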

  • How to compile all source files (default make target does not compile all of them)

    - by Piotr Krukowiecki
    Hi, when I compile Android (http://source.android.com/download) it does not compile some source files. For example there is external/bluetooth/bluez/sbc/sbc.c, which is not compiled. There are also other such files. It's possible those files need not be compiled. Or it might be that I need some special configuration to compile them. Either way, if it is possible, I'd like to compile them. Is there some way to do it? Maybe some "compile_all" make target? (I believe the reason why I want to compile all source files is not important.)
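
    For what it's worth, the Android build compiles only modules that the chosen product configuration depends on, so there is no stock "compile_all" target. A sketch of building one stray module by path, using the helper functions from envsetup.sh:

        . build/envsetup.sh
        lunch                                # choose the target as usual
        mmm external/bluetooth/bluez/sbc     # build just this directory's Android.mk
        # or: cd external/bluetooth/bluez/sbc && mm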

  • Changing location of ClamAV logging files

    - by GrumpyCanuck
    I've run into a weird problem with ClamAV that I have been unable to resolve, due to an incredibly non-informative error message. I've installed ClamAV via aptitude on an Ubuntu box (ClamAV 0.96.5/13202 according to the system) up on EC2 and it is 100% stock. We have an additional drive mounted under /mnt where we put all our log files. When I start it up with the log files in the default location, it runs just fine. However, if I change the configuration file from /var/log/clamav/clamav.log to /mnt/clamav/clamav.log I get the error ERROR: Can't open /mnt/clamav/clamav.log in append mode (check permissions!). ERROR: Can't initialize the internal logger It's the same file with the same permissions on it, just in a different location. Any thoughts or tips on how to resolve this problem would be greatly appreciated.
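
    Two hedged guesses, since the Unix permissions reportedly match: the clamav user may not own the new directory, and stock Ubuntu ships an AppArmor profile for clamd that only allows the default log paths, which produces exactly this "check permissions!" error even when the filesystem permissions are fine. A sketch of checking both:

        sudo mkdir -p /mnt/clamav && sudo chown clamav:clamav /mnt/clamav
        # look for AppArmor denials around the failure time
        sudo grep -i denied /var/log/kern.log | grep -i clam
        # if denials appear, extend /etc/apparmor.d/usr.sbin.clamd with a
        # "/mnt/clamav/** rw," rule and reload, or (with apparmor-utils) confirm via:
        sudo aa-complain /usr/sbin/clamd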

  • Distributing Files using a Group Policy on Windows Server 2003

    - by tonedeath
    A piece of software that we use at our office has recently moved to a new licensing system. This means that from now on a new set of license key files will need to be distributed to each of our 25 client installations every year. All of the clients run XP and are part of an AD domain controlled by a Windows 2003 DC. I'm already using group policies to deploy software updates. I gather that this is possible with Group Policy Preferences in Server 2008. I'm just looking for a good method using Server 2003. The same set of files need copying to each client. I also have them hosted on a network share accessible by each client. I'm more of a *nix person, so I'm not particularly up on scripting in a Windows environment.
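
    Without Group Policy Preferences, the Server 2003 workhorse is a computer startup script (Computer Configuration > Windows Settings > Scripts > Startup), which runs as SYSTEM before anyone logs on. A minimal batch sketch with placeholder paths:

        @echo off
        rem copy the new license keys if they are newer than what's installed
        xcopy "\\fileserver\licenses\*.lic" "C:\Program Files\VendorApp\" /D /Y /I

    /D copies only files newer than the destination, so the script is safe to leave assigned year-round; a logon script would work too, but only if users have write access to the target folder.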

  • EC2 Filesystem / Files stored on the wrong partition after launching new instance from AMI

    - by Philip Isaacs
    Today I set up a new EC2 instance from an AMI I created from an older EC2 instance. I took the AMI that was on a small instance and launched it on a medium instance. From what I can tell this is pretty standard stuff. But here's the strange part. According to AWS these are the differences:

    Small Instance (Default): 1.7 GB of memory, 1 EC2 Compute Unit (1 virtual core with 1 EC2 Compute Unit), 160 GB of local instance storage, 32-bit or 64-bit platform

    Medium Instance: 3.75 GB of memory, 2 EC2 Compute Units (1 virtual core with 2 EC2 Compute Units each), 410 GB of local instance storage, 32-bit or 64-bit platform

    Here's where I'm having an issue: when I log into the new, bigger instance, it still reports having only 1.7 GB of RAM. The other strange part is that all my old partitions are still there in the same configurations. I see a new, larger partition /mnt which is essentially empty:

        Filesystem            Size  Used Avail Use% Mounted on
        /dev/sda1             7.9G  5.9G  1.6G  79% /
        none                  846M  120K  846M   1% /dev
        none                  879M     0  879M   0% /dev/shm
        none                  879M   76K  878M   1% /var/run
        none                  879M     0  879M   0% /var/lock
        none                  879M     0  879M   0% /lib/init/rw
        /dev/sda2             335G  195M  318G   1% /mnt
        /dev/sdf               16G  9.9G  5.1G  67% /var2

    This EC2 instance is a web server and I was serving files off the /var2 directory, but for some reason the instance is storing everything on /. Okay, here's what I'd like to do: move all my website files to /mnt and have the web server point to that. Any suggestions? If it helps, here is what my mount output looks like as well:

        root@myserver:/var# mount -l
        /dev/sda1 on / type ext3 (rw) [cloudimg-rootfs]
        proc on /proc type proc (rw,noexec,nosuid,nodev)
        none on /sys type sysfs (rw,noexec,nosuid,nodev)
        none on /sys/kernel/debug type debugfs (rw)
        none on /sys/kernel/security type securityfs (rw)
        none on /dev type devtmpfs (rw,mode=0755)
        none on /dev/pts type devpts (rw,noexec,nosuid,gid=5,mode=0620)
        none on /dev/shm type tmpfs (rw,nosuid,nodev)
        none on /var/run type tmpfs (rw,nosuid,mode=0755)
        none on /var/lock type tmpfs (rw,noexec,nosuid,nodev)
        none on /lib/init/rw type tmpfs (rw,nosuid,mode=0755)
        /dev/sda2 on /mnt type ext3 (rw)
        /dev/sdf on /var2 type ext4 (rw,noatime)

    I hope this question makes sense. Basically I want my old files on this new partition. Thanks in advance.
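
    One caution before moving anything (a hedged editor's note): on EC2, /mnt is ephemeral instance storage, so anything placed there is lost when the instance is stopped or fails; an EBS volume like the existing /dev/sdf is the safer home for a document root. If /mnt is still the goal, a sketch of the move, assuming an Apache-style server:

        sudo rsync -a /var2/ /mnt/www/
        # then point the vhost's DocumentRoot (or equivalent) at /mnt/www
        sudo /etc/init.d/apache2 reload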

  • Search files for text matching format of a Unix directory

    - by BrandonKowalski
    I am attempting to search through all the files in a directory for text matching the pattern of an arbitrary Unix directory path. I hope to use the output to make a list of all directories referenced in the files (this part I think I can figure out on my own). I have looked at various regex resources and made my own expression that seems to work in a browser-based tool but not with grep on the command line: /\w+[(/\w+)]+ My understanding so far is that the above expression will look for the beginning / of a directory, then look for an indeterminate number of characters, before looking for a repeating block of the same thing. Any guidance would be greatly appreciated.
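
    A hedged diagnosis: grep defaults to basic regular expressions, where + is a literal unless escaped, and the /.../  delimiters belong to the browser tool, not to grep; the [( ... )] part is also just a bracket expression (character class) to grep, not a group. A sketch using extended regex to pull absolute Unix paths out of files:

        # -r recurse, -o print only matches, -h drop filenames, -E extended regex
        grep -rohE '(/[A-Za-z0-9._-]+)+' . | sort -u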

  • JBoss AS: use .xml files in the properties-service.xml

    - by fgysin
    The properties service (configured in properties-service.xml) in JBoss application server lets you specify external .properties files that are loaded and can then be accessed as system properties from the deployed applications. (See here http://community.jboss.org/wiki/PropertiesService for more info...) Is it also possible to load config files in the .xml format instead of .properties? I know it is possible for certain given configs like for example the mail-service.xml and the jboss-log4j.xml... But they are both loaded directly by JBoss, and not via the properties service.
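
    The properties service itself only parses the java.util.Properties text format, but that class can also read the XML properties format. A workaround sketch (application code, not a JBoss feature): load the file at deploy time, for example from a ServletContextListener, and push the values into the system properties the same way the service would:

        import java.io.FileInputStream;
        import java.util.Properties;

        Properties props = new Properties();
        FileInputStream in = new FileInputStream("/opt/conf/app-config.xml");
        try {
            // note: loadFromXML expects the java.util.Properties XML format,
            // not arbitrary XML such as mail-service.xml
            props.loadFromXML(in);
        } finally {
            in.close();
        }
        System.getProperties().putAll(props);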

  • WCF SSL secure transfer of large payloads without changing the firewall

    - by Sir Mix
    I need to transfer small amounts of data intermittently from clients to our server in a secure fashion and pull down large binary files from the server occasionally. It's important for all this to be reliable. I'm anticipating 100,000 clients. I control both ends, but I want to deliver a solution that doesn't require changing the firewall for the majority of customers. A lag of one or two minutes before the information migrates to the server or comes down seems to be acceptable at this time. We need to make the connection secure, so I was thinking about SSL, but I'm open to suggestions. Basically, what is the best binding to use in this situation so that we have a secure transmission and the system handles the stress and load in a way that works for 95% of clients out of the box (firewalls will not block it in the majority of firewall configurations)?
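
    The firewall constraint usually points at HTTPS on port 443, i.e. basicHttpBinding with transport security, with streaming enabled so large downloads aren't buffered whole. A hedged configuration sketch (names are placeholders):

        <basicHttpBinding>
          <binding name="secureStreamed"
                   transferMode="StreamedResponse"
                   maxReceivedMessageSize="2147483647"
                   sendTimeout="00:10:00">
            <security mode="Transport" />
          </binding>
        </basicHttpBinding>

    net.tcp bindings are faster, but they are exactly the kind of traffic corporate firewalls block, so they fail the 95%-out-of-the-box requirement.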

  • Copying files locally on a network folder

    - by Altay_H
    I noticed that copying a large file from one location on a network drive to another location on the same network drive takes much longer than copying it locally. Instead of copying the file locally, the network computer sends the file to my remote computer, which sends it back to the same network computer. This means the files are being transferred over the network completely unnecessarily. Is there a way to fix this issue? It's becoming a real hassle to manage the video files on my network drive. Note: This is the case with both Windows and Linux (using Samba) network folders.
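
    That round trip is inherent to the protocol: the client issues the reads and writes, and neither SMB1 nor Samba of this vintage supports server-side copy offload. The practical fix is to run the copy on the machine that owns the disk, e.g.:

        # from any machine, make the file server do the work itself
        ssh fileserver 'cp -a /srv/media/video.mkv /srv/media/archive/'

    On a server without SSH, a scheduled job or a remote desktop session on the server accomplishes the same thing.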

  • Splitting/Merging 3gp files using a java library

    - by user141146
    Hi, I'm wondering if there's a java library out there that can manipulate 3gp files. Mostly I'm interested in splitting or merging existing video files. I've looked at JMF (java media framework), but it doesn't support 3gp…and FFmpeg looks promising, but it's not clear that the library allows splitting/merging of existing files. Does such a library exist?
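
    If no pure-Java option turns up, a common fallback is driving the ffmpeg binary from Java, since it can split 3gp streams without re-encoding. A hedged sketch (java.io imports assumed), assuming ffmpeg is installed and on the PATH:

        // cut 30 seconds starting at 1:00, copying streams instead of re-encoding
        ProcessBuilder pb = new ProcessBuilder(
                "ffmpeg", "-ss", "60", "-t", "30", "-i", "in.3gp",
                "-acodec", "copy", "-vcodec", "copy", "out.3gp");
        pb.redirectErrorStream(true);
        Process p = pb.start();
        // drain output so a full pipe buffer can't block ffmpeg
        BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()));
        while (r.readLine() != null) { /* optionally log progress */ }
        int exit = p.waitFor(); // 0 on success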

  • ajax : multiple ajax calls for included js files inside jsp fragment

    - by Nrj
    I am including a JSP fragment by making an Ajax call. Now this JSP fragment happens to include several JS files. When the Ajax request is completed, it loads each of the JS files included in the fragment using a separate GET request. (I checked this using Firebug.) Is this the correct behavior (making separate GET calls), or am I missing something? Is there a way to include the JS files and send the response in one go?
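
    This is expected browser behavior rather than an Ajax quirk: every script src the fragment brings in is a separate GET. The standard cure is to serve those scripts as one bundle so the fragment references a single file, e.g. concatenated at build time:

        cat menu.js grid.js validate.js > fragment-bundle.js

    With caching headers on the bundle, the extra requests disappear after the first load anyway, which may be all that's needed.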

  • NSIS takes ownership of IIS system files

    - by Lucas
    I recently encountered an issue with NSIS that I believe is related to an interaction with UAC, but I am at a loss to explain it and I do not know how to prevent it in the future. I have an installer that creates and removes IIS virtual directories using the NsisIIS plugin. The installer appeared to work correctly on my Windows 7 workstation. When the installer was run on a Windows 2008 R2 server it installed properly, but the uninstaller removed all of the virtual directories and put IIS in an unusable state, to the point that I had to remove the Default Web Site and re-add it. What I eventually found was that all of the IIS configuration files under C:\Windows\System32\inetsrv\config had a lock icon on them. Some investigation seems to indicate that this means a user account has taken ownership of the file; however, all the files listed SYSTEM as the file owner. I did check a different server that I have not run the installer on, and it does not have the lock icon applied to the IIS files. I have also seen the same lock icon appear on other files that the NSIS installer creates. For instance, I have a Web.Config.tpl file that is processed using the NSIS ReplaceInFile macro, which also appears with the lock icon after the installer finishes. After I explicitly grant another user account access to the file, the lock icon goes away. I run the installer under the local Administrator account on the 2008 R2 server, so I do not get the UAC prompt. Here is the relevant code from the install.nsi file:

        RequestExecutionLevel admin

        Section "Application" APP_SECTION
          SectionIn RO
          Call InstallApp
        SectionEnd

        Section "un.Uninstaller Section"
          Delete "$PROGRAMFILES\${PROGRAMFILESDIR}\Uninstall.exe"
          Call un.InstallApp
        SectionEnd

        Function InstallApp
          File /oname=Web.Config Web.Config.tpl
          !insertmacro ReplaceInFile Web.Config %CONNECTION_STRING% $CONNECTION_STRING
        FunctionEnd

        Function un.InstallApp
          ReadRegStr $0 HKLM "Software\${REGKEY}" "VirtualDir"
          NsisIIS::DeleteVDir "$0"
          Pop $0
        FunctionEnd

    I have three questions stemming from this incident: How did this happen? How can I fix my installer to prevent it from happening again? And how can I repair the permissions on the IIS config files?
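
    On the third question, a hedged repair sketch: the lock overlay generally means the file's ACL no longer matches what inheritance from the parent would grant, and icacls can rebuild the inherited ACLs. From an elevated prompt, after testing on a copy:

        icacls C:\Windows\System32\inetsrv\config /reset /T

    For prevention, the NSIS AccessControl plugin (if adding a plugin is an option) can explicitly grant read access on the files the installer writes, instead of leaving whatever ACL the elevated installer happened to produce.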

  • Getting Classic ASP to work in .js files under IIS 7

    - by Abdullah Ahmed
    I am moving a client's classic ASP webapp to a new IIS7-based server. The site contains some .js files which have JavaScript but also classic ASP in <% %> tags, containing a bunch of conditional statements designed to spit out pieces of JavaScript based on session state variables. Here's a brief example of what such a file could look like:

        var arrHOFFSET = -1;
        var arrLeft = "<";
        var arrRight = ">";
        <% If ((Session("dashInv") = "True") And ((Session("systemLevelStaff") = "4") Or (Session("systemLevelCompany") = "4"))) Then %>
        addMainItem("/MgmtTools/WelcomeInventory.asp?wherefrom=salesMan","",81,"center","","",0,0,"","","","","");
        <% Else %>
        <% If (Session("dashInv") = "False") And ((Session("systemLevelStaff") = "4") Or (Session("systemLevelCompany") = "4")) Then %>
        <% Else %>
        addMainItem("/calendar/welcome.asp","",81,"center","","",0,0,"","","","","");
        <% End If %>
        <% End If %>
        defineSubmenuProperties(135,"center","center",-3,0,"","","","","","","");

    Currently this file (named custom.js, for example) starts throwing JS errors, because the server does not recognize the ASP code in it and therefore does not parse it. I know I need to somehow specify that a .js file should also be treated like an .asp file and run through the parser. However, I am not sure how to go about doing this. Here is what I've tried so far. Under the server node in IIS, under Handler Mappings, I created a new Script Map with the following settings:

        Request Path: *.js
        Executable: C:\Windows\System32\inetsrv\asp.dll
        Name: ASPClassicInJSFiles
        Mapping: Invoke Handler only if request is mapped to: File
        Verbs: All verbs
        Access: Script

    I also created a similar handler under the site node itself. Under MIME Types, .js is defined as application/x-javascript. None of these work. If I simply rename the file to have an .asp extension then things work; however, this app is poorly coded and has literally hundreds of files with the .js files included in them under various names and locations, so rename, search and replace is the last option I have.
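
    The same mapping expressed in the site's web.config, as a hedged sketch; the piece that is easy to miss in the UI is that classic ASP runs through IsapiModule:

        <system.webServer>
          <handlers>
            <add name="ASPClassicInJSFiles" path="*.js" verb="GET,HEAD,POST"
                 modules="IsapiModule"
                 scriptProcessor="C:\Windows\System32\inetsrv\asp.dll"
                 resourceType="File" />
          </handlers>
        </system.webServer>

    Note that every .js on the site, including pure static libraries, will then pass through asp.dll, which costs some performance.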

  • .Net Library for parsing source code files?

    - by Jörg Battermann
    Does anyone know of a good .NET library that allows me to parse source code files, but not only .NET source code files (like Java, Perl, Ruby, etc.)? I need programmatic access to the contents of various source code files (e.g. class/method/parameter names, types, etc.). Has anyone come across something like this? I know within .NET it is reasonably possible and there are some libraries out there, but I need that to be abstracted to more types of programming languages.

  • PHP: Class to parse OGG and .ogv files?

    - by Nic Hubbard
    I am looking for a PHP class that can parse Ogg and .ogv files so that I can get some of the metadata out of the files, such as comments, bitrate, length, etc. I have found this: http://opensource.grisambre.net/ogg/ but after testing it, it does not seem to parse any of the files that I test it with. Has anyone had luck with an alternative? I would use getID3(), but it does not support ogg video.
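
    For what it's worth, getID3 does parse Ogg containers; its Vorbis (audio) support is solid, while Theora video support is thinner, which may be what "does not support ogg video" refers to. A sketch of pulling the common fields:

        <?php
        require_once 'getid3/getid3.php';
        $getID3 = new getID3;
        $info = $getID3->analyze('clip.ogv');
        getid3_lib::CopyTagsToComments($info);   // fold tags into $info['comments']
        echo $info['playtime_string'];           // e.g. "4:37"
        echo $info['bitrate'];                   // average bits per second
        print_r($info['comments']);              // vorbiscomment fields, if present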

  • Time Machine (OSX) doesn't back up files in Mount Point or Disk Image File

    - by Chris
    Hi all, I found this Q&A (http://superuser.com/questions/148849/backup-mounted-drive-of-an-image-in-time-machine) and this prompted me to ask the following question: I have two disk images which are scripted to be mounted on login. These two disk images are always mounted to the same location. These two disk images are encrypted TrueCrypt volumes. Time Machine (TM) will only back up the disk images the first time they are mounted, but not after that. As I modify documents within the volumes throughout the day, the modified timestamps are adjusted properly. However, TM does not back them up. TM never backs up the mount points which are two folders within my home directory. Any ideas as to why neither the mount point or the image files are backed up? Do the image files have to be closed (unmounted) after being modified for TM to back them up? Thanks, Chris

  • LAMP stack security question - uploading files to server

    - by morpheous
    I am running Ubuntu 9.10 desktop on my home machine. I need to upload files from my local machine to my web server on a periodic basis. My server is running Ubuntu Server LTS. I want my server to be secure and only run the LAMP stack and possibly an email server. I do not (ideally) want to have FTP or anything that can allow (more) knowledgeable hackers to hack into my server. Can anyone recommend how I may send files from my local machine to the server? This may seem an easy/trivial question, but I am relatively new to Linux, and I got my previous Windows server machine seriously hacked in the past; hence the move to Linux, and that's why I am so security conscious.
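
    The usual answer here is to lean on SSH, which Ubuntu Server already runs: it encrypts the transfer, needs no FTP daemon, and can be locked down to keys only. A sketch:

        # one-time: create a key pair and install the public key on the server
        ssh-keygen -t rsa
        ssh-copy-id user@server.example.com
        # periodic upload; -a preserves permissions, -z compresses in transit
        rsync -az /home/me/upload/ user@server.example.com:/var/www/incoming/

    Setting PasswordAuthentication no in the server's /etc/ssh/sshd_config afterwards removes the brute-force surface that password logins leave open.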

  • Reset Mac OS X (Snow Leopard) File Permissions -- All Files

    - by Frank
    Is there a script or process to completely reset all file system permissions to factory defaults (short of restoring from an image backup or reinstalling the OS)? This would cover all affected files, from / to Applications to my home folder and all their contents (everything). I've tried to use Disk Utility's First Aid 'Repair Disk Permissions', but it didn't seem to touch or affect everything: some files, but not all. I've run it twice so far... I've seen this, but it's not quite the same thing: Fixing mac user file permissions, not the system. The reason for all of this is that I accidentally ran a chmod on all files (as sudo). Working too fast, and now I'm in a hole.
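
    Background that explains the partial results: 'Repair Disk Permissions' only replays the permissions recorded in Apple's package receipts, so it never touches the home folder or non-Apple files. A cautious sketch for the home-folder half, assuming the Snow Leopard defaults (755 directories, 644 files) are wanted and accepting that this makes everything group/other readable until tightened again:

        # u: full access; group/other: read and traverse only; nothing world-writable
        chmod -R u+rwX,go+rX,go-w ~
        # let the receipts-based repair handle the Apple-installed files
        sudo diskutil repairPermissions /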

  • Nginx .zip files return 404

    - by Kenley Tomlin
    I have set up Nginx as a reverse proxy for Node and to serve my static files and user-uploaded images. Everything is working beautifully except that I can't understand why Nginx can't find my .zip files. Here is my nginx.conf:

        user nginx;
        worker_processes 1;
        error_log /var/log/nginx/error.log warn;
        pid /var/run/nginx.pid;

        events {
            worker_connections 1024;
        }

        http {
            include mime.types;
            proxy_cache_path /var/www/web_cache levels=1:2 keys_zone=ooparoopaweb_cache:8m max_size=1000m inactive=600m;
            sendfile on;

            upstream *******_node {
                server 172.27.198.66:8888 max_fails=3 fail_timeout=20s;
                #fair weight_mode=idle no_rr
            }

            upstream ******_json_node {
                server 172.27.176.57:3300 max_fails=3 fail_timeout=20s;
            }

            server {
                #REDIRECT ALL HTTP REQUESTS FOR FRONT-END SITE TO HTTPS
                listen 80;
                server_name *******.com www.******.com;
                return 301 https://$host$request_uri;
            }

            server {
                #MOBILE APPLICATION PROXY TO NODE JSON
                listen 3300 ssl;
                ssl_certificate /*****/*******/json_ssl/server.crt;
                ssl_certificate_key /*****/******/json_ssl/server.key;
                server_name json.*******.com;
                location / {
                    proxy_pass http://******_json_node;
                    proxy_redirect off;
                    proxy_set_header Host $host;
                    proxy_set_header X-Real-IP $remote_addr;
                    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
                    proxy_set_header X-Forwarded-Proto https;
                    client_max_body_size 20m;
                    client_body_buffer_size 128k;
                    proxy_connect_timeout 90s;
                    proxy_send_timeout 90s;
                    proxy_read_timeout 90s;
                    proxy_buffers 32 4k;
                }
            }

            server {
                #******.COM FRONT-END SITE PROXY TO NODE WEB SERVER
                listen 443 ssl;
                ssl_certificate /***/***/web_ssl/********.crt;
                ssl_certificate_key /****/*****/web_ssl/myserver.key;
                server_name mydomain.com www.mydomain.com;
                add_header Strict-Transport-Security max-age=500;
                location / {
                    gzip on;
                    gzip_types text/html text/css application/json application/x-javascript;
                    proxy_pass http://mydomain_node;
                    proxy_redirect off;
                    proxy_set_header Host $host;
                    proxy_set_header X-Real-IP $remote_addr;
                    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
                    proxy_set_header X-Forwarded-Proto https;
                    client_max_body_size 20m;
                    client_body_buffer_size 128k;
                    proxy_connect_timeout 90s;
                    proxy_send_timeout 90s;
                    proxy_read_timeout 90s;
                    proxy_buffers 32 4k;
                }
            }

            server {
                #ADMIN SITE PROXY TO NODE BACK-END
                listen 80;
                server_name admin.mydomain.com;
                location / {
                    proxy_pass http://mydomain_node;
                    proxy_redirect off;
                    proxy_set_header Host $host;
                    proxy_set_header X-Real-IP $remote_addr;
                    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
                    client_max_body_size 20m;
                    client_body_buffer_size 128k;
                    proxy_connect_timeout 90s;
                    proxy_send_timeout 90s;
                    proxy_read_timeout 90s;
                    proxy_buffers 32 4k;
                }
            }

            server {
                # SERVES STATIC FILES
                listen 80;
                listen 443 ssl;
                ssl_certificate /**/*****/server.crt;
                ssl_certificate_key /****/******/server.key;
                server_name static.domain.com;
                access_log static.domain.access.log;
                root /var/www/mystatic/;
                location ~*\.(jpeg|jpg|png|ico)$ {
                    gzip on;
                    gzip_types text/plain text/css application/json application/x-javascript text/xml application/xml application/rss+xml text/javascript image/svg+xml application/vnd.ms-fontobject application/x-font-ttf font/opentype image/png image/jpeg application/zip;
                    expires 10d;
                    add_header Cache-Control public;
                }
                location ~*\.zip {
                    #internal;
                    add_header Content-Type "application/zip";
                    add_header Content-Disposition "attachment; filename=gamezip.zip";
                }
            }
        }

        include tcp.conf;

    Tcp.conf contains settings that allow Nginx to proxy websockets. I don't believe anything contained within it is relevant to this question.
I also want to add that I want the zip files to be a forced download.
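
    Two hedged things to check: the nginx error log records the exact path attempted on each 404 (an open() ... failed (2: No such file or directory) line), which settles whether the request even reached this server block with the expected root; and the zip location can carry the forced-download headers directly. A sketch:

        location ~* \.zip$ {
            root /var/www/mystatic/;        # explicit while debugging
            default_type application/zip;
            # without a filename parameter, browsers fall back to the URL's basename
            add_header Content-Disposition "attachment";
        }

    Anchoring the pattern with $ also keeps paths like /something.zipper from matching.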

  • Best Practices for working with files via C#

    - by user177883
    The application I work on generates several hundred files in a 15-minute period, and the back end of the application takes these files and processes them (updating the database with their values). One problem is database locks. What are the best practices for working with several thousand files, so as to avoid locking and process them efficiently? Would it be more efficient to create a single file and process it, or to process a single file at a time? What are the common best practices?
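
    A common shape for this (a sketch, with a hypothetical staging table): accumulate parsed rows in memory and hand the database a few large writes instead of hundreds of small, lock-contending ones, e.g. via SqlBulkCopy into a staging table followed by one set-based merge:

        using System.Data;
        using System.Data.SqlClient;

        // 'rows' is a DataTable built from many parsed files
        using (var bulk = new SqlBulkCopy(connectionString))
        {
            bulk.DestinationTableName = "dbo.ReadingsStage"; // hypothetical table
            bulk.BatchSize = 5000;
            bulk.WriteToServer(rows);
        }
        // a single stored procedure then merges staged rows into the live table

    Whether to merge files first matters less than batching the database writes; per-file overhead is tiny compared to per-row round trips.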

  • Tracking unique versions of files with hashes

    - by rwmnau
    I'm going to be tracking different versions of potentially millions of different files, and my intent is to hash them to determine I've already seen that particular version of the file. Currently, I'm only using MD5 (the product is still in development, so it's never dealt with millions of files yet), which is clearly not long enough to avoid collisions. However, here's my question - Am I more likely to avoid collisions if I hash the file using two different methods and store both hashes (say, SHA1 and MD5), or if I pick a single, longer hash (like SHA256) and rely on that alone? I know option 1 has 288 hash bits and option 2 has only 256, but assume my two choices are the same total hash length. Since I'm dealing with potentially millions of files (and multiple versions of those files over time), I'd like to do what I can to avoid collisions. However, CPU time isn't (completely) free, so I'm interested in how the community feels about the tradeoff - is adding more bits to my hash proportionally more expensive to compute, and are there any advantages to multiple different hashes as opposed to a single, longer hash, given an equal number of bits in both solutions?
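
    The conventional answer, hedged: two concatenated hashes are not meaningfully stronger than one good hash of comparable length, and for accidental (non-adversarial) collisions even 128 bits is already astronomically safe at millions of files; a single SHA-256 keeps the code and storage simpler. A streaming sketch so large files are never loaded whole:

        using System;
        using System.IO;
        using System.Security.Cryptography;

        static string HashFile(string path)
        {
            using (var sha = SHA256.Create())
            using (var stream = File.OpenRead(path))
            {
                byte[] digest = sha.ComputeHash(stream); // reads the stream in blocks
                return BitConverter.ToString(digest).Replace("-", "");
            }
        }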

  • Need help manipulating WAV (RIFF) Files at a byte level

    - by Eric
    I'm writing an application in C# that will record audio files (*.wav) and automatically tag and name them. Wave files are RIFF files (like AVI) which can contain metadata chunks in addition to the waveform data chunks. So now I'm trying to figure out how to read and write the RIFF metadata to and from recorded wave files. I'm using NAudio for recording the files, and asked on their forums as well as on SO for a way to read and write RIFF tags. While I received a number of good answers, none of the solutions allowed for reading and writing RIFF chunks as easily as I would like. But more importantly I have very little experience dealing with files at a byte level, and think this could be a good opportunity to learn. So now I want to try writing my own class(es) that can read in a RIFF file and allow metadata to be read from, and written to, the file. I've used streams in C#, but always with the entire stream at once. So now I'm a little lost having to consider a file byte by byte. Specifically, how would I go about removing or inserting bytes in the middle of a file? I've tried reading a file through a FileStream into a byte array (byte[]) as shown in the code below:

        System.IO.FileStream waveFileStream = System.IO.File.OpenRead(@"C:\sound.wav");
        byte[] waveBytes = new byte[waveFileStream.Length];
        waveFileStream.Read(waveBytes, 0, waveBytes.Length);

    And I could see through the Visual Studio debugger that the first four bytes are the RIFF header of the file. But arrays are a pain to deal with when performing actions that change their size, like inserting or removing values. So I was thinking I could then turn the byte[] into a List like this:

        List<byte> list = waveBytes.ToList<byte>();

    This would make any manipulation of the file byte by byte a whole lot easier, but I'm worried I might be missing something, like a class in the System.IO namespace, that would make all this even easier. Am I on the right track, or is there a better way to do this? I should also mention that I'm not hugely concerned with performance, and would prefer not to deal with pointers or unsafe code blocks like this guy. If it helps at all, here is a good article on the RIFF/WAV file format.
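
    A sketch of the usual approach: rather than splicing bytes in place, walk the chunk list and stream the file to a new copy, inserting or replacing a LIST/INFO chunk on the way through (then patch the RIFF size field at offset 4). The walk itself is short:

        using System;
        using System.IO;

        using (var reader = new BinaryReader(File.OpenRead(@"C:\sound.wav")))
        {
            reader.ReadBytes(12); // "RIFF" + file size + "WAVE"
            while (reader.BaseStream.Position < reader.BaseStream.Length)
            {
                string id = new string(reader.ReadChars(4)); // "fmt ", "data", "LIST", ...
                int size = reader.ReadInt32();               // payload size, little-endian
                Console.WriteLine("{0}: {1} bytes", id, size);
                reader.BaseStream.Seek(size + (size & 1), SeekOrigin.Current); // chunks are word-aligned
            }
        }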

  • Powershell: Read value from xml files

    - by DanielR
    I need some help with PowerShell, please. It should be pretty easy: I have a list of subdirectories, with an XML file in each one. I want to open each XML file and print the value of one node. The node is always the same, as the XML files are actually project files (*.csproj) from Visual Studio. I already got the list of files: get-item **\*.csproj. How do I proceed?
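
    A sketch of the remaining steps, assuming the node wanted is AssemblyName (any node reachable the same way works); casting to [xml] lets dot-notation walk the document without worrying about the MSBuild namespace:

        Get-ChildItem -Recurse -Filter *.csproj | ForEach-Object {
            $proj = [xml](Get-Content $_.FullName)
            # PropertyGroup usually repeats; take the first AssemblyName found
            $name = $proj.Project.PropertyGroup |
                ForEach-Object { $_.AssemblyName } |
                Where-Object { $_ } |
                Select-Object -First 1
            "{0}: {1}" -f $_.FullName, $name
        }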
