Search Results

Search found 26517 results on 1061 pages for 'large directory'.

  • Loading the 'pktgen' module on Ubuntu Server

    - by StackedCrooked
    I would like to enable and use the pktgen module on Ubuntu Server. I have enabled the module by adding a line containing 'pktgen' to the /etc/modules file. After rebooting, the module seems to be loaded successfully because the directory /proc/net/pktgen exists. However, when trying to run the first sample I get these errors:

        root@ubuntu:~# bash ./pktgen.conf-1-1
        Removing all devices
        Adding eth4
        Setting max_before_softirq 10000
        Configuring /proc/net/pktgen/eth4
        ./pktgen.conf-1-1: line 9: /proc/net/pktgen/eth4: No such file or directory
        cat: /proc/net/pktgen/eth4: No such file or directory
        cat: /proc/net/pktgen/eth4: No such file or directory
        (the same three lines repeat several more times)
        Running... ctrl^C to stop
        Done

    It turns out the script is simply unable to write a file to the /proc/net/pktgen directory. When I try this manually it fails as well:

        root@ubuntu:~# cd /proc/net/pktgen/
        root@ubuntu:/proc/net/pktgen# touch eth4
        touch: cannot touch `eth4': No such file or directory

    Can anyone help me make it work? I'm using Ubuntu with kernel version 2.6.32-21-server.

    Fixed: I apologize for not keeping this post up to date. I was able to fix it. If I remember well, the cause of the error was that eth4 did not exist, or did not have the 'online' status. Anyway, it is fixed now.
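
    The fix matches how pktgen binds devices: the per-interface file under /proc/net/pktgen only appears after the interface has been registered with a pktgen kernel thread, and that fails if the NIC does not exist or is down. A minimal sketch of the usual sanity checks, assuming a single pktgen kernel thread (kpktgend_0):

        # Verify the interface exists and bring it up; pktgen needs it up
        ip link show eth4
        ip link set eth4 up
        # Register the device with pktgen's first kernel thread
        echo "rem_device_all" > /proc/net/pktgen/kpktgend_0
        echo "add_device eth4" > /proc/net/pktgen/kpktgend_0
        # /proc/net/pktgen/eth4 should now exist
        ls /proc/net/pktgen/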

    Read the article

  • Prefix files with the current directory name using Powershell

    - by XST
    I have folders with images (*.png and *.jpg):

        >C:\Directory\Folder1
        01.png
        02.png
        03.jpg
        04.jpg
        05.png

    And I want to rename all the files like this using PowerShell:

        >C:\Directory\Folder1
        Folder1 - 01.png
        Folder1 - 02.png
        Folder1 - 03.jpg
        Folder1 - 04.jpg
        Folder1 - 05.png

    So I came up with this simple line:

        Get-ChildItem | Where-Object { $_.Extension -eq ".jpg" -or $_.Extension -eq ".png" } | Rename-Item -NewName { $_.Directory.Name + " - " + $_.Name }

    If I have 35 or fewer files in the folder, I get the wanted result, but if there are 36 or more files, I end up with this:

        >C:\Directory\Folder1
        Folder1Folder1Folder1 - 01.png
        Folder1Folder1Folder1 - 02.png
        Folder1Folder1Folder1 - 03.jpg
        Folder1Folder1Folder1 - 04.jpg
        Folder1Folder1Folder1 - 05.png

    The loop only stops when a file's name exceeds 248 characters. Any ideas why it's looping?
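
    The symptom looks like a pipeline feedback loop: with enough files, Get-ChildItem is still enumerating the directory while Rename-Item creates new names in it, so already-renamed files get picked up and renamed again. A minimal sketch of the usual workaround, forcing the enumeration to complete (the parentheses) before any rename happens:

        # Parentheses make Get-ChildItem finish listing before Rename-Item runs
        (Get-ChildItem | Where-Object { $_.Extension -eq ".jpg" -or $_.Extension -eq ".png" }) |
            Rename-Item -NewName { $_.Directory.Name + " - " + $_.Name }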

    Read the article

  • SFTP: file symlinks in a jailed (chrooted) directory

    - by Kevin Duke
    I'm trying to set up SFTP so that a few trusted people can access/edit/create some files. I have jailed a user into their home directory (/home/name) but have run into a problem. I also want them to be able to access other parts of the VPS, because it is also a game server, web host, etc., and I want them to have full control of files outside their jailed directory. I tried making a symlink (ln -s) to the desired directory, but it does not work, as expected. I tried hard-linking the files (cp -rl) I wanted to give access to, and that worked -- they can edit the files in their directory and the changes show up in the copy stored outside the jail. BUT new files don't carry over: they can create them inside the jail, but nothing appears outside it. I know I'm probably not doing this the "right way", but what can I do to get what I want?
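
    Symlinks are resolved inside the chroot, and hard links only cover files that already exist, which explains both failures. The usual answer is a bind mount, which makes a directory outside the jail appear inside it, new files included. A hedged sketch -- the /srv/gameserver path is hypothetical:

        # Expose an outside directory inside the jail; new files propagate both ways
        mkdir -p /home/name/gameserver
        mount --bind /srv/gameserver /home/name/gameserver
        # Make it survive reboots:
        echo "/srv/gameserver /home/name/gameserver none bind 0 0" >> /etc/fstab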

    Read the article

  • proftpd: copying uploaded files to an additional directory

    - by Matthew Iselin
    Using proftpd, is there a good way to automatically synchronise uploaded files from the upload directory to some other directory? Our layout ends up a bit like this:

        ~/ftp/some/path                              <-- files are uploaded here
        ~/some/other/path/not/accessible/via/ftp     <-- but they also need to be here after uploading

    Is there a good way to do this automatically, or do I have to tell anyone uploading files to upload twice, and open up an additional directory (containing data we cannot redistribute)?
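
    proftpd's mod_exec module can trigger a script on each upload, but a simpler sketch that needs no proftpd configuration at all is a periodic one-way mirror with rsync. Paths are taken from the layout above; the "ftpuser" account name and the 5-minute interval are assumptions:

        # /etc/cron.d entry: mirror new uploads every 5 minutes
        */5 * * * * ftpuser rsync -a ~ftpuser/ftp/some/path/ ~ftpuser/some/other/path/not/accessible/via/ftp/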

    Read the article

  • Ubuntu Deluge shows errors when downloading large bit torrent files and keeps erroring out after trying to resume

    - by MikeN
    Ubuntu Deluge shows errors when downloading large BitTorrent files and keeps erroring out after trying to resume. The error in the Details pane shows: "Invalid argument". This happens for many large torrents that have been running (trying to download) for several days. I try to "resume" and "force recheck", but it never works. Smaller torrents seem to work OK. What is causing these torrents to never complete? Is there a way to force Deluge to keep auto-resuming every few minutes after a failure instead of just giving up?

    Read the article

  • Ubuntu 12.04 Samba File Server timeout on large file

    - by phileaton
    I am a beginner with servers. I checked the error logs for Samba, and it appears that Samba is timing out when I transfer large files. I can successfully add PDFs, for instance, to my file server. However, I tried to add a large 1.2 GB video file and it did not succeed. This is the error in the log:

        smbd/process.c:244(read_packet_remainder)
        read_fd_with_timeout failed for client 0.0.0.0 read error = NT_STATUS_CONNECT$

    Is there a way I can stop it from timing out? Any pointers would be great! Thanks!
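
    A read_fd_with_timeout failure usually means the client connection dropped mid-transfer (NIC driver, cabling, or client-side power management are worth checking) rather than a Samba-side size limit. That said, socket buffer tuning in smb.conf is a common experiment for stalls on big transfers. A hedged sketch -- the values are illustrative, not a guaranteed fix:

        [global]
           # larger socket buffers are a common tweak for large-transfer stalls
           socket options = TCP_NODELAY SO_RCVBUF=131072 SO_SNDBUF=131072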

    Read the article

  • Unable to install .bin files ("No such file or directory" error)

    - by rogerstone
    I wanted to install Adobe Reader on my Ubuntu 10.10 (Maverick Meerkat). I downloaded the file, copied it to my desktop, and then browsed to the desktop directory in a command-line terminal. I have tried all the possible combinations of the commands, but I still get a "No such file or directory" error:

        roger@ubuntu:~/Desktop$ chmod a+x AdbeRdr9.4-1_i486linux_enu.bin
        roger@ubuntu:~/Desktop$ sudo ./AdbeRdr9.4-1_i486linux_enu.bin
        sudo: unable to execute ./AdbeRdr9.4-1_i486linux_enu.bin: No such file or directory

    I tried without sudo, and this is what I get:

        roger@ubuntu:~/Desktop$ chmod a+x AdbeRdr9.4-1_i486linux_enu.bin
        roger@ubuntu:~/Desktop$ ./AdbeRdr9.4-1_i486linux_enu.bin
        bash: ./AdbeRdr9.4-1_i486linux_enu.bin: No such file or directory

    The file is present on the desktop. Where am I going wrong? P.S.: This is not a duplicate of the question "Cannot install .bin package on Ubuntu".
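
    "No such file or directory" for a file that plainly exists is the classic sign of a 32-bit executable whose loader is missing on a 64-bit install: the kernel reports the missing interpreter, not the binary itself. A hedged way to check, plus the fix as it existed on Ubuntu 10.10 (the ia32-libs package was later replaced by multiarch):

        # 'ELF 32-bit' output plus an x86_64 kernel confirms the mismatch
        file AdbeRdr9.4-1_i486linux_enu.bin
        uname -m
        # Install the 32-bit compatibility libraries (10.10-era package name)
        sudo apt-get install ia32-libs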

    Read the article

  • Correcting owner/permissions on damaged directory tree in linux

    - by mcs130
    I inadvertently made a backup copy of a directory recursively and forgot the -a (--preserve) switch when doing so. This damaged my backup directory (which contains data we need to access). The directory and all of its child folders and files comprise an installation of an application, including Postgres DB and Solr files. The original copy was used for a failed re-config attempt. Now I need to use the backup copy to start over, only the ownership of the backup copy is now root across everything, and it is no longer usable (processes won't run due to the ownership problems I created when I forgot the -a on the cp -r). I've re-installed a clean copy of the application into a third location (which has the correct owner/perms) and need to copy the owner/perms from this good directory onto the damaged directory. What is the best way (if even possible) to do this? (I've Googled and seen suggestions ranging from Perl scripting to setfacl/getfacl, but I'm unfortunately still confused.) Apologies if this seems a dumb question. Thanks.
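
    Since the clean install and the damaged backup share the same layout, GNU chown and chmod can take the metadata from one tree and apply it to the other via their --reference option. A hedged sketch -- the two root paths are hypothetical, run it as root, and try it on a subdirectory first:

        # Mirror owner/group and mode from the clean tree onto the damaged one
        CLEAN=/opt/app-clean
        DAMAGED=/opt/app-damaged
        cd "$CLEAN"
        find . | while IFS= read -r p; do
            chown --reference="$CLEAN/$p" "$DAMAGED/$p"
            chmod --reference="$CLEAN/$p" "$DAMAGED/$p"
        done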

    Read the article

  • Setting default working directory/drive in Emacs shell on Windows

    - by Victor K.
    Hello, how can I change the default working directory/drive for the shell in Emacs (on Windows)? Normally, the shell starts in the same directory as the file in the current buffer. However, when my current file is on the D: drive, it starts in C:. Manually changing the drive to D: in the shell brings me to my directory, of course, but I want to avoid this extra step. Is it possible?
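
    M-x shell inherits the buffer-local default-directory variable, so one hedged approach is a small wrapper command that pins that variable before the shell starts. The d:/work/ path is just an example:

        ;; Start a shell in a fixed directory regardless of the current buffer
        (defun my-shell-at-d ()
          (interactive)
          (let ((default-directory "d:/work/"))
            (shell)))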

    Read the article

  • How to simply remove everything from a directory on Linux

    - by Tometzky
    How do I simply remove everything from the current or a specified directory on Linux? Several approaches:

        rm -fr *
        rm -fr dirname/*

    Does not work -- it will leave hidden files (the ones that start with a dot) and files starting with a dash in the current dir, and will not work with too many files.

        rm -fr -- *
        rm -fr -- dirname/*

    Does not work -- it will leave hidden files and will not work with too many files.

        rm -fr -- * .*
        rm -fr -- dirname/* dirname/.*

    Don't try this -- it will also remove the parent directory, because ".." also starts with a ".".

        rm -fr * .??*
        rm -fr dirname/* dirname/.??*

    Does not work -- it will leave files like ".a", ".b" etc., and will not work with too many files.

        find -mindepth 1 -maxdepth 1 -print0 | xargs -0 rm -fr
        find dirname -mindepth 1 -maxdepth 1 -print0 | xargs -0 rm -fr

    As far as I know correct, but not simple.

        find -delete
        find dirname -delete

    AFAIK correct for the current directory, but used with a specified directory it will delete that directory as well.

        find -mindepth 1 -delete
        find dirname -mindepth 1 -delete

    AFAIK correct, but is it the simplest way?

    Read the article

  • Large public datasets?

    - by Jason
    I am looking for some large public datasets, in particular:

        - Large sample web server logs that have been anonymized.
        - Datasets used for database performance benchmarking.

    Any other links to large public datasets would be appreciated. I already know about Amazon's public datasets at: http://aws.amazon.com/publicdatasets/

    Read the article

  • Unable to delete a directory from NTFS drive: "Access is denied"

    - by Evgeny
    I'm running Windows XP Pro x64 SP2. I have a directory on an NTFS drive that was created by a Maven build. A subsequent build attempted to delete this directory and failed. I now get the error "Access is denied" whenever I try to do anything with that directory: change to it, delete it, rename it. This happens both in Windows Explorer and from a command prompt. The properties dialog in Windows Explorer doesn't even contain the Security tab. I created the directory, so I don't think this is truly a permissions issue. I've occasionally had this error happen in the past as well. I believe the error is misleading, but the question is: what is the real problem and how do I fix it?
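
    Two causes worth ruling out before touching ACLs: a process (often the build's own JVM) still holding an open handle inside the directory, and a name Win32 can't address normally, such as one ending in a trailing space or dot -- something build tools can create. A hedged check for the latter uses the \\?\ path prefix, which bypasses normal Win32 name parsing (the path is an example):

        rd /s /q "\\?\C:\projects\target\stuck-directory"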

    Read the article

  • What is a vim "runtime directory"?

    - by Andres Jaan Tack
    I'm trying to get started with things like FuzzyFinder, but I am stuck at the point where its help says:

        INSTALLATION                                    *fuf-installation*

        Put all files into your runtime directory. If you have the zip file,
        extract it to your runtime directory.

        You should place the files as follows:
            your_runtime_directory/plugin/fuf.vim
            ...

    What is my "runtime directory"? How do I know if I have one? Why does it matter how I put things into it?
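
    Vim's runtime directories are the entries of its 'runtimepath' option; user plugins conventionally go into the first writable one, which is ~/.vim on Unix and ~/vimfiles on Windows. You can inspect the option from inside Vim:

        :set runtimepath?
        " typically: runtimepath=~/.vim,/usr/share/vim/vimfiles,/usr/share/vim/vim72,...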

    Read the article

  • Filezilla client unable to get directory listing from Filezilla Server (Windows)

    - by sestocker
    I've set up a self-signed certificate in FileZilla Server and enabled FTP over SSL/TLS. When I connect from the FileZilla client, I am able to authenticate but cannot get a directory listing:

        Status: Connecting to MY_SERVER_IP:21...
        Status: Connection established, waiting for welcome message...
        Response: 220-FileZilla Server version 0.9.39 beta
        Response: 220-written by Tim Kosse ([email protected])
        Response: 220 Please visit http://sourceforge.net/projects/filezilla/
        Command: AUTH TLS
        Response: 234 Using authentication type TLS
        Status: Initializing TLS...
        Status: Verifying certificate...
        Command: USER MYUSER
        Status: TLS/SSL connection established.
        Response: 331 Password required for MYUSER
        Command: PASS ********
        Response: 230 Logged on
        Command: PBSZ 0
        Response: 200 PBSZ=0
        Command: PROT P
        Response: 200 Protection level set to P
        Status: Connected
        Status: Retrieving directory listing...
        Command: PWD
        Response: 257 "/" is current directory.
        Command: TYPE I
        Response: 200 Type set to I
        Command: PORT 10,10,25,85,219,172
        Response: 200 Port command successful
        Command: MLSD
        Response: 150 Opening data channel for directory list.
        Response: 425 Can't open data connection.
        Error: Failed to retrieve directory listing

    I have ports 21 and 50001 through 50005 open on the firewall. We are migrating servers -- opening 50001-50005 is one of the things that helped get FTPS working on the old server. I'm not sure whether this installation would use the same ports. What else could be the problem?

    Read the article

  • RewriteRules targeting a directory result in a gratuitous redirect [closed]

    - by MapDot
    I have a standard CMS-like RewriteRule set up in my .htaccess:

        RewriteRule ^(.+)$ index.php?slug=$1

    Let's say I have a directory called "foo" in the root directory. For some reason, if you hit the page it causes a redirect:

        http://www.mysite.com/foo  ->  http://www.mysite.com/foo?slug=foo

    Removing the directory fixes the problem, but unfortunately, it's not an option. Does anyone know of a workaround?
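
    What the redirect suggests: mod_dir sees a real directory and issues its usual trailing-slash redirect, and the rewritten query string rides along with it. The standard CMS pattern sidesteps this by leaving requests for real files and directories alone. A hedged sketch:

        # Only rewrite requests that match no real file or directory
        RewriteEngine On
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.+)$ index.php?slug=$1 [L,QSA]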

    Read the article

  • Quickly create a large file on a Windows system?

    - by Leigh Riffel
    In the same vein as http://stackoverflow.com/questions/257844/quickly-create-a-large-file-on-a-linux-system I'd like to quickly create a large file on a Windows system. By large I'm thinking 5 GB. The content doesn't matter. A built-in command or short batch file would be preferable, but I'll accept an application if there are no other easy ways.
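
    One built-in option is fsutil, which allocates the file without streaming content through the writer, so it completes almost instantly (run it from an administrator prompt; the path is an example):

        fsutil file createnew C:\temp\large.bin 5368709120

    The size argument is in bytes; 5368709120 is 5 GB (5 x 1024^3).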

    Read the article

  • Check if all files in a directory exist elsewhere

    - by aioobe
    I'm about to remove an old backup directory, but before doing so I'd like to make sure that all of its files exist in a newer directory. Is there a tool for this? Or am I best off doing this "manually" using find, md5sum, sorting, comparing, etc.? Clarification: if I have the following directory listings:

        /path/to/old_backup/dir1/fileA
        /path/to/old_backup/dir1/fileB
        /path/to/old_backup/dir2/fileC

    and

        /path/to/new_backup/dir1/fileA
        /path/to/new_backup/dir2/fileB
        /path/to/new_backup/dir2/fileD

    then fileA and fileB exist in new_backup (fileA in its original directory, and fileB has moved from dir1 to dir2). fileC, on the other hand, is missing in new_backup, and fileD has been created. In this situation I'd like the output to be something like:

        fileC exists in old_backup, but not in new_backup.
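
    Since files may move between directories, the comparison has to be by content rather than by path. A hedged sketch with find and md5sum -- it matches on hashes only, so an identical file at any path in new_backup counts as present:

        # Hash every file in the new backup once
        find /path/to/new_backup -type f -exec md5sum {} + | awk '{print $1}' | sort -u > /tmp/new.md5
        # Report old files whose content is absent from the new backup
        find /path/to/old_backup -type f -exec md5sum {} + |
        while IFS= read -r line; do
            hash=${line%% *}
            file=${line#*  }
            grep -qx "$hash" /tmp/new.md5 || echo "$file exists in old_backup, but not in new_backup"
        done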

    Read the article

  • Create 8.3 name for an existing directory

    - by Chris Karcher
    I have a machine that initially had 8.3 filename creation disabled. However, this was causing issues with some legacy software, so it was re-enabled. I'm wondering if it's possible to go back and "add" 8.3 filenames to certain existing directories. For example, say I have a directory named "C:\name with spaces" and I get the following output when I run "dir /x":

        C:\>dir /x
         Volume in drive C has no label.
         Volume Serial Number is 6873-65B8

         Directory of C:\

        04/09/2010  01:57 PM    <DIR>                       name with spaces
        ...

    I'd like to somehow add an 8.3 name for the directory without recreating it, and then get the following:

        C:\>dir /x
         Volume in drive C has no label.
         Volume Serial Number is 6873-65B8

         Directory of C:\

        04/09/2010  01:57 PM    <DIR>          NAMEWI~1     name with spaces
        ...

    I tried the 'rename' command, but it didn't do the trick.
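
    fsutil can attach a short name to an existing file or directory in place, which is exactly this case. A hedged sketch -- it needs an elevated prompt, and 8.3 name creation must already be enabled on the volume, as it is here:

        fsutil file setshortname "C:\name with spaces" NAMEWI~1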

    Read the article

  • Unable to delete a directory from NTFS drive: "Access is deined"

    - by Evgeny
    I'm running Windows XP Pro x64 SP2. I have a directory on an NTFS drive that was created by a Maven build. A subsequent build attempted to delete this directory and failed. I now get the error "Access is denied" whenever I try to do anything with that directory: change to it, delete it, rename it. This happens both in Windows Explorer and from a command prompt. The properties dialog in Windows Explorer doesn't even contain the Security tab. I created the directory, so I don't think this is truly a permissions issue. I've occasionally had this error happen in the past is well. I believe the error is misleading, but the question is: what is the real problem and how do I fix it?

    Read the article

  • How do I serialize a large graph of .NET objects into a SQL Server BLOB without creating a large buffer?

    - by Ian Ringrose
    We have code like:

        ms = New IO.MemoryStream
        bin = New System.Runtime.Serialization.Formatters.Binary.BinaryFormatter
        bin.Serialize(ms, largeGraphOfObjects)
        dataToSaveToDatabase = ms.ToArray()
        ' put dataToSaveToDatabase in a SQL Server BLOB

    But the memory stream allocates a large buffer from the large object heap, and that is giving us problems. So how can we stream the data without needing enough free memory to hold the serialized objects? I am looking for a way to get a Stream from SQL Server that can then be passed to bin.Serialize(), so avoiding keeping all the data in my process's memory. Likewise for reading the data back...

    Some more background: this is part of a complex numerical processing system that processes data in near real time looking for equipment problems etc.; the serialization is done to allow a restart when there is a problem with data quality from a data feed etc. (We store the data feeds and can rerun them after the operator has edited out bad values.) Therefore we serialize the objects a lot more often than we de-serialize them.

    The objects we are serializing include very large arrays, mostly of doubles, as well as a lot of small "more normal" objects. We are pushing the memory limit on a 32-bit system and making the garbage collector work very hard. (Efforts are being made elsewhere in the system to improve this, e.g. reusing large arrays rather than creating new arrays.) Often the serialization of the state is the last straw that causes an out-of-memory exception; our peak memory usage is while this serialization is being done.

    I think we get large-object-heap fragmentation when we de-serialize the object; I expect there are also other problems with large-object-heap fragmentation given the size of the arrays. (This has not yet been investigated, as the person who first looked at this is a numerical processing expert, not a memory management expert.)

    Our customers use a mix of SQL Server 2000, 2005 and 2008, and we would rather not have different code paths for each version of SQL Server if possible.

    We can have many active models at a time (in different processes, across many machines); each model can have many saved states. Hence the saved state is stored in a database BLOB rather than a file. As the spread of saving the state is important, I would rather not serialize the object to a file and then put the file in a BLOB one block at a time.

    Other related questions I have asked:

        How to Stream data from/to SQL Server BLOB fields?
        Is there a SqlFileStream like class that works with Sql Server 2005?
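
    One commonly suggested workaround (only on SQL Server 2005 and later, since 2000 lacks varbinary(max)) is a thin write-only Stream that appends each buffer the formatter hands it to the BLOB with UPDATE ... .WRITE, so the full serialized graph never exists in memory at once. A hedged C# sketch -- the table and column names (dbo.ModelState, State, Id) are hypothetical, and the row's State column must be initialised to 0x rather than NULL before appending:

        using System;
        using System.Data.SqlClient;
        using System.IO;

        // Write-only stream that appends chunks to a varbinary(max) column.
        class SqlBlobWriteStream : Stream
        {
            private readonly SqlConnection _conn;
            private readonly int _id;

            public SqlBlobWriteStream(SqlConnection conn, int id) { _conn = conn; _id = id; }

            public override void Write(byte[] buffer, int offset, int count)
            {
                byte[] chunk = new byte[count];
                Array.Copy(buffer, offset, chunk, 0, count);
                using (var cmd = new SqlCommand(
                    "UPDATE dbo.ModelState SET State.WRITE(@chunk, NULL, NULL) WHERE Id = @id", _conn))
                {
                    cmd.Parameters.AddWithValue("@chunk", chunk);  // NULL offset = append
                    cmd.Parameters.AddWithValue("@id", _id);
                    cmd.ExecuteNonQuery();
                }
            }

            // Write-only: everything else is unsupported.
            public override bool CanRead { get { return false; } }
            public override bool CanSeek { get { return false; } }
            public override bool CanWrite { get { return true; } }
            public override void Flush() { }
            public override long Length { get { throw new NotSupportedException(); } }
            public override long Position
            {
                get { throw new NotSupportedException(); }
                set { throw new NotSupportedException(); }
            }
            public override int Read(byte[] b, int o, int c) { throw new NotSupportedException(); }
            public override long Seek(long o, SeekOrigin so) { throw new NotSupportedException(); }
            public override void SetLength(long v) { throw new NotSupportedException(); }
        }

    Wrapping this in a BufferedStream before handing it to BinaryFormatter.Serialize batches the formatter's many small writes into fewer round trips.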

    Read the article

  • Best and Proper Permissions Settings for Directory

    - by Dr. DOT
    I am interested in knowing the proper, yet security-conscious, settings for a directory. Here's my scenario:

        - I have a username for FTP access to my server called "user".
        - For the purpose of the scenario, PHP runs as "nobody" on my server.
        - I have a directory off the document root called "sample".
        - The "sample" directory is chmod'd at 0755 (drwxr-xr-x).
        - "sample" is owned by "user" and the group is set to "user".

    The above is all very straightforward and standard. So I want a script to be able to create (mkdir) and delete (rmdir) directories under "sample". Yet I don't want to overly expose my server by opening up the permissions (I could easily chmod sample to 0777 and make it world-writable). What is the best combination of permissions, owner settings and/or group settings to allow my script to create and delete directories under "sample" while retaining the ability for "user" to continue to FTP into the directory? Thanks.
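
    Since anything tighter than 0777 grants write access only to the owner and the group, the usual move is a shared group containing both identities, with group write plus the setgid bit on the directory. A hedged sketch -- the group name and path are examples:

        # Both "user" and the PHP user "nobody" join a dedicated group
        groupadd webshare
        usermod -a -G webshare user
        usermod -a -G webshare nobody
        # Group write; setgid keeps new subdirectories in the same group
        chgrp webshare /path/to/sample
        chmod 2775 /path/to/sample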

    Read the article

  • Managing Linux Directory Permissions & SFTP

    - by Dizzle
    Good morning; I have a RHEL 5.7 web server configured to allow SSH/SFTP only by specific groups. I'd like for content managers to upload content to their respective directories and have that content inherit the user/group ownership of the directory regardless of upload method or application. For example: John is in group "web" for SSH/SFTP rights and "finance" for directory permissions, and uploads to directory "webstuff" via SFTP. Directory "webstuff" has permissions of "2760" (rwxrws---) and ownership of "apache:finance". If John uploads an update to an existing file in "webstuff", the ownership of the file stays "apache:finance". If John uploads a new file to "webstuff", the ownership of the file is "john:finance". My desire is for any file John uploads to "webstuff" to take on the directory's owner. I've tried with setuid and setgid both set, but the user ownership didn't take. I've seen mentions on ServerFault of using ACLs, or a chrooted jail for SFTP, but I have yet to configure and test them, and I don't know if they're a viable solution (they could be, I just don't know because I've never done either). Any thoughts and assistance would be greatly appreciated.
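
    The setgid behaviour observed here is all Linux offers natively: group ownership inherits from the directory, but user ownership always belongs to the creating process, so setuid on a directory is ignored. Default ACLs can at least guarantee the permission bits on new files; forcing the owner then takes a periodic sweep. A hedged sketch (the /var/www prefix is an assumption):

        # New files under webstuff inherit group rwX for "finance"
        setfacl -d -m g:finance:rwX /var/www/webstuff
        # Periodic ownership sweep, e.g. from cron, to force apache:finance
        chown -R apache:finance /var/www/webstuff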

    Read the article

  • IIS: redirect everything to another URL, except for one Directory

    - by DrStalker
    I have an IIS server (IIS 6, Win 2003) that hosts the site http://www.foo.com. I want any request to http://foo.com (no matter what path/filename is used) to redirect to http://www.bar.org/AwesomePage.html UNLESS the request is for http://www.foo.com/specialdir, in which case the HTML files in the local directory specialdir should be used. The problem I have is that once the redirect is set, it also affects /specialdir -- even if I right-click on that directory and select "content should come from ... local directory", the change does not take effect, and the directory still shows as redirecting to http://www.bar.org/AwesomePage.html. The same thing happens if I try to set individual files to load from the local system instead of redirecting -- IIS gives no error, but the change does not take effect and the files still show as being redirected. How can I set specialdir to override the redirection to the new URL?

    Read the article

  • Accessing a website's directory in IIS from FileZilla

    - by Cdeez
    I have my ASP.NET website deployed in a virtual directory in IIS. Usually an FTP client like FileZilla is used to upload files to a website's directory from a remote system. FileZilla asks for a host name, username and password to connect to the remote server. Now all I want is for my users on the LAN to be able to access this directory from their systems using an FTP client like FileZilla. So how can I provide the host name, username and password for my website's directory? I tried to find this on Google, but no help. Detailed steps please. It's IIS version 5.1.

    Read the article

  • Optimizing processing and management of large Java data arrays

    - by mikera
    I'm writing some pretty CPU-intensive, concurrent numerical code that will process large amounts of data stored in Java arrays (e.g. lots of double[100000]s). Some of the algorithms might run millions of times over several days, so getting maximum steady-state performance is a high priority. In essence, each algorithm is a Java object that has a method API something like:

        public double[] runMyAlgorithm(double[] inputData);

    or alternatively a reference could be passed to the array to store the output data:

        public void runMyAlgorithm(double[] inputData, double[] outputData);

    Given this requirement, I'm trying to determine the optimal strategy for allocating / managing array space. Frequently the algorithms will need large amounts of temporary storage space. They will also take large arrays as input and create large arrays as output. Among the options I am considering are:

        1. Always allocate new arrays as local variables whenever they are needed (e.g. new double[100000]). Probably the simplest approach, but it will produce a lot of garbage.
        2. Pre-allocate temporary arrays and store them as final fields in the algorithm object - the big downside would be that only one thread could run the algorithm at any one time.
        3. Keep pre-allocated temporary arrays in ThreadLocal storage, so that a thread can use a fixed amount of temporary array space whenever it needs it. ThreadLocal would be required since multiple threads will be running the same algorithm simultaneously.
        4. Pass around lots of arrays as parameters (including the temporary arrays for the algorithm to use). Not good, since it will make the algorithm API extremely ugly if the caller has to be responsible for providing temporary array space....
        5. Allocate extremely large arrays (e.g. double[10000000]) but also provide the algorithm with offsets into the array, so that different threads will use a different area of the array independently. Will obviously require some code to manage the offsets and allocation of the array ranges.

    Any thoughts on which approach would be best (and why)?
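
    Of the options above, option 3 has the least obvious shape in code: each thread lazily gets its own scratch array the first time it asks and reuses it on every subsequent call, so steady-state allocation drops to the output arrays alone. A hedged sketch (Java 8+ syntax; the scratch size and the doubling "work" are purely illustrative):

        public class MyAlgorithm {
            // One scratch buffer per thread, allocated once and reused forever
            private static final ThreadLocal<double[]> SCRATCH =
                    ThreadLocal.withInitial(() -> new double[100000]);

            public double[] runMyAlgorithm(double[] inputData) {
                double[] tmp = SCRATCH.get();              // no allocation after first call
                double[] output = new double[inputData.length];
                for (int i = 0; i < inputData.length; i++) {
                    tmp[i % tmp.length] = inputData[i] * 2.0;  // stand-in for real work
                    output[i] = tmp[i % tmp.length];
                }
                return output;
            }
        }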

    Read the article
