Search Results

Search found 40999 results on 1640 pages for 'duplicate files'.


  • Move files from multiple folders all into parent directory with command prompt [win7]

    - by Nick
    I have multiple .rar files in multiple folders like this:

        C:\Docs\Folder1\rarfile1-1.rar
        C:\Docs\Folder1\rarfile1-2.rar
        C:\Docs\Folder1\rarfile1-3.rar
        C:\Docs\Folder2\rarfile2-1.rar
        C:\Docs\Folder2\rarfile2-2.rar
        C:\Docs\Folder2\rarfile2-3.rar
        C:\Docs\Folder3\rarfile3-1.rar
        C:\Docs\Folder3\rarfile3-2.rar
        C:\Docs\Folder3\rarfile3-3.rar

    I want to move all of the .rar files to the parent directory 'C:\Docs'. I have a lot more than 3 folders, so I was thinking of making a batch file or something. What would be the commands to do this? Thanks
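
    A minimal sketch of one way to do this from the command prompt, assuming the layout above (one level of subfolders; inside a batch file, double the percent signs, i.e. %%D):

        for /d %D in ("C:\Docs\*") do move "%D\*.rar" "C:\Docs"

    Note that files with the same name in different subfolders would collide in C:\Docs, so check for duplicates first.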

    Read the article

  • Getting content with images into the Web Browser control without temporary files

    - by Revision17
    For a client-server application, I'd like to display content with images in a web browser control, without writing temporary files to disk. I've tried using MHT files via DocumentStream and DocumentText, but the web browser control isn't smart enough to recognize MHT files. I would use data URI images; however, most of the computers this will be installed on use IE6 or 7. Are there any other options for this?

    Read the article

  • uploading zip files in codeigniter won't work

    - by krike
    I have created a helper that takes some parameters and should upload a file. The function works for images but not for zip files. I searched on Google and even added a MY_upload.php (http://codeigniter.com/bug_tracker/bug/6780/), but I still have the problem, so I used print_r to display the array of the uploaded files. The image is fine, but the zip array is empty:

        Array ( [file_name] => [file_type] => [file_path] => [full_path] => [raw_name] => [orig_name] => [file_ext] => [file_size] => [is_image] => [image_width] => [image_height] => [image_type] => [image_size_str] => )

        Array ( [file_name] => 2385b959279b5e3cd451fee54273512c.png [file_type] => image/png [file_path] => I:/wamp/www/e-commerce/sources/images/ [full_path] => I:/wamp/www/e-commerce/sources/images/2385b959279b5e3cd451fee54273512c.png [raw_name] => 2385b959279b5e3cd451fee54273512c [orig_name] => 1269770869_Art_Artdesigner.lv_.png [file_ext] => .png [file_size] => 15.43 [is_image] => 1 [image_width] => 113 [image_height] => 128 [image_type] => png [image_size_str] => width="113" height="128" )

    This is the helper function:

        function multiple_upload($name = 'userfile', $upload_dir = 'sources/images/', $allowed_types = 'gif|jpg|jpeg|jpe|png', $size)
        {
            $CI =& get_instance();

            $config['upload_path']   = realpath($upload_dir);
            $config['allowed_types'] = $allowed_types;
            $config['max_size']      = $size;
            $config['overwrite']     = FALSE;
            $config['encrypt_name']  = TRUE;

            $ffiles = $CI->upload->data();
            echo "<pre>";
            print_r($ffiles);
            echo "</pre>";

            $CI->upload->initialize($config);

            $errors = FALSE;

            if (!$CI->upload->do_upload($name)): // I believe this is causing the problem, but I'm new to CodeIgniter, so no idea where to look for errors
                $errors = TRUE;
            else:
                // Build a file array from all uploaded files
                $files = $CI->upload->data();
            endif;

            // There were errors, we have to delete the uploaded files
            if ($errors):
                @unlink($files['full_path']);
                return false;
            else:
                return $files;
            endif;
        } // end of multiple_upload()

    And this is the code in my controller:

        if (!$s_thumb = multiple_upload('small_thumb', 'sources/images/', 'gif|jpg|jpeg|jpe|png', 1024)): // http://www.matisse.net/bitcalc/
            $data['feedback'] = '<div class="error">Could not upload the small thumbnail!</div>';
            $error = TRUE;
        endif;

        if (!$main_file = multiple_upload('main_file', 'sources/items/', 'zip', 307200)):
            $data['feedback'] = '<div class="error">Could not upload the main file!</div>';
            $error = TRUE;
        endif;

    Read the article

  • Does Windows DFS always keep some files backlogged?

    - by Badger
    I have been monitoring our DFS backlog and I have noticed that it hasn't really dropped below 1000 or so files. I am assuming this means it needs more bandwidth. So, starting last night, I allowed it to use 512Kbps between 6pm and 4am, when it used to get only 128Kbps. I noticed a large drop, but it never went below about 1500 files. I just want to be sure my conclusion about needing to use more bandwidth is correct before I tell my boss about it. I have included a graph of the data showing my stats from yesterday afternoon and last night (DFS Backlog Graph).

    Read the article

  • Taking ownership of trustedinstaller files?

    - by P a u l
    vista32-sp1: I am unable to delete some files on my system that were installed with 'special permissions' by 'trustedinstaller'. I find the usual help suggestion to use 'takeown' is not working; all I get is 'access denied'. I refuse to believe there isn't some way to delete these files, or that Microsoft has finally achieved its perfect security filesystem. This is NOT a case of a file being locked by a process. If that were all it was, I could solve it by myself. I know there are some recommended unlocking programs and they might do some sort of file system trick, but I would like to know what my possible direct actions might be. If a 3rd-party program can 'unlock' a file, I want to know the mechanism. But like I said, 'takeown' at the command line is not working for this.
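
    For reference, a sketch of the usual sequence from an elevated command prompt (the path is a placeholder); if takeown still reports access denied even when elevated, this won't help:

        takeown /F "C:\Windows\System32\example.dll" /A
        icacls "C:\Windows\System32\example.dll" /grant Administrators:F
        del "C:\Windows\System32\example.dll"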

    Read the article

  • Format Factory not working for mov files

    - by LanguaFlash
    I have attempted multiple destination formats with no success. I am using Format Factory 2.30. I drag and drop a .mov file into FF, select 'all to mpg' (or any format), then click Start. It thinks for a couple of seconds and then jumps to 100%. When I open the file I get no sound or picture from the video. These .mov files were created with a Canon SX10 camera. (It is my brother's camera, so I'm not sure of the model.) Any suggestions? TMPGEnc is able to convert the files with the QTReader plug-in, so the video isn't corrupted or anything. Thanks. Jeff

    Read the article

  • Checkout repo from SVN but use local files to populate

    - by aidan
    I have an SVN server on our development server, and I release to our production server using rsync. It's not ideal, but it's worked so far. Anyway, I've finally got the SVN client installed on the production server and I want to start using that to copy files from development to production. My problem is this: I don't want to check all the data out of development when I already have it on the production server. Is there a way to "check out" a repository but use the files that are already on the production server (and force it to assume they are the head versions, for example)? Thanks.
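
    One option to experiment with, sketched here with placeholder paths and URL, is a forced checkout into the existing directory; as far as I know it still fetches pristine copies into .svn, but it leaves the files already on the production server in place and reports any differences as local modifications:

        svn checkout --force http://devserver/svn/project/trunk /var/www/site
        svn status /var/www/site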

    Read the article

  • Deploy Connectivity Platform Folder/Files

    - by Dave
    I'm using the Microsoft.SmartDevice.Connectivity features, but I do not know how to get the CoreCon folder and files under "C:\Documents and Settings\All Users\Application Data\Microsoft\corecon\1.0\1033" deployed as part of an install project. The "new DatastoreManager(1033)" call fails because users of my app do not have that folder and its device XML files. Any ideas on how to install or deploy those files? I assume I got them as part of an API I've installed, but I cannot pinpoint the source. Thanks

    Read the article

  • icacls in windows 7 does not give full permission to write files in root drive

    - by Menuta
    icacls in Windows 7 does not give full permission to write files in the root drive. We have a very old application based on Omnis7 that needs to create and read/write files on drive C: when running as a restricted user. In Windows XP, granting this permission is quite trivial using cacls:

        cacls C:\ /G Everyone:(C)

    The equivalent icacls in Windows 7 will not work:

        icacls C:\ /Grant Everyone:(M)

    I have also tried the following:

        icacls C:\ /Grant Everyone:(F)
        icacls C:\ /Grant Domain\user:(F)

    Trying to create a file with a restricted user gives this:

        C:\>copy nul text.txt
        Access is denied.
                0 file(s) copied.

    After applying the icacls permissions above, the result changes to this:

        C:\>copy nul text.txt
        A required privilege is not held by the client.
                0 file(s) copied.

    Is this an issue with the way I am applying the permissions? Or is Windows 7 being extremely strict?

    Read the article

  • Zip up web page groups to view in browsers

    - by Arlen Beiler
    There should be a standard for saving and viewing bunches of webpages as a website. For instance, say I have a whole bunch of pages, such as those I get from the WordPress plugin "Really Static", which saves an entire site, and all the links start with a slash. I can't really use those links if I am reading the pages from the file system. If there were a standard where we could zip up the files, give the archive a unique extension (like "hzip" for HTML zip), and open it with any browser, the browser would display it as though the root of that archive were the root of the pages (like "http://example.com"), and the links would all work right. This would facilitate sharing and copying groups of webpages.

    Read the article

  • php, user-uploaded files, version control, and website deployment

    - by user151841
    I have a website whose code I update regularly. I keep it in version control. When I want to deploy a new version of the site, I do an export and then symlink the served directory name to the directory of the deployment. There is a place where users can upload files, and I noticed once that, after I had deployed a new version, the user files were gone! Of course, I hadn't added them to the repository, and since the served site came from an export, they weren't uploaded into a version-controlled directory anyway. PHP doesn't yet have integrated SVN functionality, so I couldn't do much programmatically with user-uploaded files. My solution was to create an additional website, files.website.com, which sits in a directory parallel to the served website and is served out of a directory that is under version control. That way the uploads don't get obliterated when I upgrade the website. From time to time, I manually add uploaded files to the SVN project, delete user-deleted ones, and commit the new version. I'm working on a shell script to run from cron to do this, but it isn't my forte, so it's on the back burner as it's not a pressing need. Is there a better way to do this?
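
    For comparison, a common pattern (sketched with placeholder paths and a hypothetical release name) is to keep uploads out of the exported tree entirely and symlink a shared directory into each release, so deployments never touch user files:

        # one-time: create a shared home for uploads outside the releases
        mkdir -p /srv/site/shared/uploads

        # on every deploy: export the new release, link the shared uploads into it,
        # then flip the "current" symlink that the web server points at
        svn export http://devserver/svn/site/trunk /srv/site/releases/2010-05-01
        ln -s /srv/site/shared/uploads /srv/site/releases/2010-05-01/uploads
        ln -sfn /srv/site/releases/2010-05-01 /srv/site/current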

    Read the article

  • Getting Classic ASP to work in .js files under IIS 7

    - by Abdullah Ahmed
    I am moving a client's classic ASP webapp to a new IIS7-based server. The site contains some .js files which hold JavaScript but also classic ASP in <% %> tags, containing a bunch of conditional statements designed to spit out pieces of JavaScript based on session state variables. Here's a brief example of what the file could be like:

        var arrHOFFSET = -1;
        var arrLeft = "<";
        var arrRight = ">";
        <% If ((Session("dashInv") = "True") And ((Session("systemLevelStaff") = "4") Or (Session("systemLevelCompany") = "4"))) Then %>
            addMainItem("/MgmtTools/WelcomeInventory.asp?wherefrom=salesMan","",81,"center","","",0,0,"","","","","");
        <% Else %>
            <% If (Session("dashInv") = "False") And ((Session("systemLevelStaff") = "4") Or (Session("systemLevelCompany") = "4")) Then %>
            <% Else %>
                addMainItem("/calendar/welcome.asp","",81,"center","","",0,0,"","","","","");
            <% End If %>
        <% End If %>
        defineSubmenuProperties(135,"center","center",-3,0,"","","","","","","");

    Currently this file (named custom.js, for example) will start throwing JS errors, because the server doesn't seem to recognize the ASP code in it and therefore does not parse it. I know I need to somehow specify that a .js file should also be treated like an .asp file and run through the ASP parser, but I am not sure how to go about doing this. Here is what I've tried so far. Under the server node in IIS, under Handler Mappings, I created a new Script Map with the following settings:

        Request Path: *.js
        Executable:   C:\Windows\System32\inetsrv\asp.dll
        Name:         ASPClassicInJSFiles
        Mapping:      Invoke handler only if request is mapped to: File
        Verbs:        All verbs
        Access:       Script

    I also created a similar handler under the site node itself. Under MIME Types, .js is defined as application/x-javascript. None of these work. If I simply rename the file to have an .asp extension then things work; however, this app is poorly coded and has literally hundreds of files with the .js files included in them under various names and locations, so rename plus search-and-replace is the last option I have.

    Read the article

  • Join .doc files into one .doc (keeping the original format of every document)

    - by Shiki
    I have about 50 .doc files that look perfect (they were extracted with Able2Extract). Now I want to join these 50 files into one huge .doc. I've tried using Word's built-in "Insert" feature, but that messed up the whole format. I want to keep everything I have, like just document1 - document2 - document3. Nothing "intelligent" or "smart" is needed during the conversion, just the capability of joining them. (Thus making them all searchable; that's the ultimate aim.) I don't mind if the method/solution adds a single blank page at the end of every document either.

    Read the article

  • Boot Camp fails to create a Windows partition because it can't move files

    - by Jens Bannmann
    I'm running Mac OS X 10.6 (Snow Leopard) on a Mac with a 320 GB drive and 167 GB of free space, and I can't get Boot Camp running. The wizard starts creating the Windows partition, but fails with a message claiming it cannot move some files. The message suggests backing up my hard disk, reformatting it, restoring my files, and re-running the Boot Camp wizard. The problem is: though I do have backups (Time Machine), I don't feel like formatting my hard disk right now :-) I found a thread in some forum discussing this problem. The suggestion was to defragment my volume with iDefrag, and lots of people claimed that solved the issue. So I went ahead and got iDefrag 1.7.1, created a bootable DVD, and chose the "compact" setting recommended before partitioning - but still no luck with Boot Camp! So how do I get this working? Fun note: last year, I briefly set up Boot Camp with 10.5, and it worked perfectly. Probably I did not use that much hard disk space back then...

    Read the article

  • nginx: php-fastcgi running but php files not executing

    - by Daniel
    I have recently set up an nginx server with PHP running as a FastCGI process. The server serves HTML files fine; however, PHP files are downloaded instead of displayed, and the PHP code is not processed. This is what I have in nginx.conf:

        server {
            listen       80;
            server_name  pubserver;

            location ~ \.php$ {
                root           /usr/share/nginx/html;
                fastcgi_pass   127.0.0.1:9000;
                fastcgi_index  index.php;
                fastcgi_param  SCRIPT_FILENAME  /usr/share/nginx/html$fastcgi_script_name;
                include        fastcgi_params;
            }
        }

    The command netstat -tulpn | grep :9000 displays the following, which indicates php-fastcgi is running and listening on port 9000:

        tcp        0      0 127.0.0.1:9000        0.0.0.0:*        LISTEN        2663/php-cgi

    In case it's of any importance, my server is running CentOS 6, and I installed nginx and PHP using the repositories from The Fedora Project.

    Read the article

  • FTP transfer timeouts while uploading small files

    - by Hamed Momeni
    I have this problem: when I need to transfer some files (mostly small files, < 100KB), the connection times out. Well, actually, it uploads one file and then fails on the next, until my client reconnects to the server, and the same thing happens over and over again. I googled the problem, and some said that switching from passive mode to active mode could solve it, but it didn't work for me. Even continuously pinging the server to keep the connection alive was to no avail. P.S. I have root access to the server. Update: I'm running ProFTPD on a CentOS VPS. I tried a few clients (FireFTP, FileZilla), all with the same problem.

    Read the article

  • Using a script that uses Duplicity + S3 excluding large files

    - by Jason
    I'm trying to write a backup script that will exclude files over a certain size. If I run the script, duplicity gives an error; however, if I copy and paste the same command generated by the script, everything works... Here is the script:

        #!/bin/bash
        # Export some ENV variables so you don't have to type anything
        export AWS_ACCESS_KEY_ID="accesskey"
        export AWS_SECRET_ACCESS_KEY="secretaccesskey"
        export PASSPHRASE="password"

        SOURCE=/home/
        DEST=s3+http://s3bucket
        GPG_KEY="gpgkey"

        # exclude files over 100MB
        exclude () {
            find /home/jason -size +100M \
            | while read FILE; do
                echo -n " --exclude "
                echo -n \'**${FILE##/*/}\' | sed 's/\ /\\ /g' # Replace whitespace with "\ "
            done
        }

        echo "Using Command"
        echo "duplicity --encrypt-key=$GPG_KEY --sign-key=$GPG_KEY `exclude` $SOURCE $DEST"

        duplicity --encrypt-key=$GPG_KEY --sign-key=$GPG_KEY `exclude` $SOURCE $DEST

        # Reset the ENV variables.
        export AWS_ACCESS_KEY_ID=
        export AWS_SECRET_ACCESS_KEY=
        export PASSPHRASE=

    When the script is run I get the error:

        Command line error: Expected 2 args, got 6

    Where am I going wrong?
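
    For what it's worth, a sketch of one way to build the exclude list without relying on the shell re-parsing quotes out of the command substitution (paths taken from the script above); each --exclude pattern stays a single argument even if a file name contains spaces:

        # build the excludes as a bash array instead of a string
        EXCLUDES=()
        while IFS= read -r FILE; do
            EXCLUDES+=( --exclude "**${FILE##/*/}" )
        done < <(find /home/jason -size +100M)

        duplicity --encrypt-key="$GPG_KEY" --sign-key="$GPG_KEY" "${EXCLUDES[@]}" "$SOURCE" "$DEST"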

    Read the article

  • Subversion: Adding files to the project

    - by Ran
    Hi, I am using library xyz, whose files live in folder xyz, and I want to update the files (e.g. an upgrade to a new version). Can I just copy the new xyz folder into my project using the file browser? The folder has both files and directories. /Subversion noob
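
    A sketch of the usual sequence, assuming the xyz folder is already part of the working copy: copy the new version over the existing folder (rather than into a fresh, unversioned one) and let Subversion pick up the changes:

        cp -r /path/to/new/xyz/. xyz/       # overwrite the working-copy files in place
        svn status xyz                      # review what changed
        svn add --force xyz                 # schedule any brand-new files for addition
        svn commit -m "Upgrade xyz library" # files removed upstream still need svn rm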

    Read the article

  • IIS Directory listing doesn't recognize .mkv files

    - by Buckley
    Hello, I use the directory listing function in IIS to put up a bunch of files for friends and family to easily access and download. My problem is that it lists .mkv files, but when you click one, I get a 'The page cannot be found' error. I've tried relocating the file and renaming it, but I get the same error each time. Why does it do this? It's only my .mkv files; everything else works perfectly. Thanks in advance.
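
    IIS 7 refuses to serve static files whose extensions have no registered MIME type, which matches the "page cannot be found" symptom, so one thing worth checking (the MIME type below is an assumption) is adding a mapping for .mkv, either in IIS Manager or with appcmd:

        %windir%\system32\inetsrv\appcmd set config /section:staticContent /+"[fileExtension='.mkv',mimeType='video/x-matroska']"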

    Read the article

  • Executed PHP files are stale until "touched" (Symlinked NFS mount as web root)

    - by mmattax
    We have a PHP application served by 3 web servers (running Nginx and Apache). The web servers' document roots are symlinked directories that point to an NFS mount. For example: web01 has an NFS mount at /data/webapp, which is symlinked to /home/webapp, and Apache serves content from /home/webapp/www. We also use APC for our PHP opcode cache. When we deploy code, we SCP an archive file to the NFS server and extract it. Since upgrading to RedHat 6, when we deploy our code the web servers execute "stale" PHP files until touch is run on the PHP files. We thought that APC might be causing the problem, but the issue persists even after clearing the opcode cache. Any ideas on how to diagnose why the stale PHP code is being executed?

    Read the article

  • Grand Central Strategy for Opening Multiple Files

    - by user276632
    I have a working implementation using Grand Central Dispatch queues that (1) opens a file and computes an OpenSSL DSA hash on "queue1", and (2) writes out the hash to a new "sidecar" file on "queue2" for later verification. I would like to open multiple files at the same time, but based on some logic that doesn't "choke" the OS by having hundreds of files open and exceeding the hard drive's sustainable throughput. Photo-browsing applications such as iPhoto or Aperture seem to open multiple files and display them, so I'm assuming this can be done. I'm assuming the biggest limitation will be disk I/O, as the application can (in theory) read and write multiple files simultaneously. Any suggestions? TIA
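
    A sketch of one way to cap the number of files in flight, written in Swift for brevity (hashFile and writeSidecar are simplified stand-ins for the OpenSSL hashing and sidecar-writing steps, not real APIs):

        import Foundation

        let maxOpenFiles = 4
        let gate = DispatchSemaphore(value: maxOpenFiles)           // counting semaphore = open-file budget
        let hashQueue = DispatchQueue(label: "queue1", attributes: .concurrent)
        let sidecarQueue = DispatchQueue(label: "queue2")

        func hashFile(at url: URL) -> Data {
            // stand-in for the DSA hash: just read the bytes
            return (try? Data(contentsOf: url)) ?? Data()
        }

        func writeSidecar(_ digest: Data, for url: URL) {
            try? digest.write(to: url.appendingPathExtension("sig"))
        }

        func process(urls: [URL]) {
            for url in urls {
                gate.wait()                                         // blocks once maxOpenFiles are in flight
                hashQueue.async {
                    let digest = hashFile(at: url)
                    sidecarQueue.async {
                        writeSidecar(digest, for: url)
                        gate.signal()                               // free a slot once the sidecar is written
                    }
                }
            }
        }

    Tuning maxOpenFiles, rather than dispatching every file at once, keeps the disk from being hit with hundreds of simultaneous reads.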

    Read the article

  • Can I grant permissions on files in windows 7 using a security identifier from another machine

    - by Thomas
    I have an external hard drive, and I wish to grant permissions on some files to users from 2 different computers without having to hook it up to those 2 computers. I know the SID of the user on the other computer; I'd like to know if and how I can grant permissions on files using the SID. I'm running Windows 7 Professional 64-bit, and the other computer runs Windows 7 Home Premium 64-bit. They are not in a domain, just separate computers on a home network (not even the same homegroup). Note: duplicate of "Is there a way to give NTFS file permissions to users from other Windows installations?"
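
    icacls accepts a raw SID when it is prefixed with an asterisk, so a sketch like the following (the drive letter, path, and SID are placeholders) should grant modify rights without ever attaching the drive to the other machine:

        icacls "E:\Shared\somefile.doc" /grant *S-1-5-21-1111111111-2222222222-3333333333-1001:(M)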

    Read the article

  • Delete specific files after installation using visual studio setup project

    - by Vadiklk
    I have this problem: I want to build an installer for my C# solution that will be placed in a folder alongside other installation folders and files that need to be copied to the installation folder. That part is easy; I just copy them into the folder I create, using the folder structure I want. Now, I also want to install another program and run a .exe file I've created to unzip some files for me. For that I need to copy 2 .exe files and 2 DLLs (for the exes) to the folder to which I am installing, and create 2 custom actions that will use them. That I've managed to do. After that I want to delete those 4 extra files, as the user does not need them and shouldn't even be aware they are there. How can I do that? I couldn't find a way in the built-in setup project preferences, and I do not know how to make a custom installer class. A bonus question: how do I make the other installer (one of the .exe files is just a plain installer) install quietly to any path? I do not want the user to see an installer pop up out of my program installer. Thanks!

    Read the article
