Search Results

Search found 40999 results on 1640 pages for 'duplicate files'.


  • Ubuntu crashed during the update to 12.04, now I can't recover my files, help please

    - by mrah
    I'm pretty new to Linux and only installed Ubuntu because I couldn't afford to buy Windows; it worked well and I liked it. When prompted, I chose to upgrade to the newest version. The update froze and the machine became unresponsive, which forced me to hard reboot it. Now nothing loads and I've reached my wits' end (mainly because I'm lost in all the command lines). I've decided to try to recover my data from the hard drive, just two folders, by selecting the "Try Ubuntu" option when I boot from the OS CD. The problem I'm experiencing now is that it won't let me copy my folders; I get 'The folder contents could not be displayed. You do not have the permissions necessary to view the contents of "folder_name".' Does anyone know how I can recover this data?

    Read the article

  • Standardize SQL Server Installations with Configuration Files

    If you need to install multiple SQL Server instances with the same settings, you most likely want to do it without repeating the numerous manual installation steps each time. The tip below walks you through installing a SQL Server instance with less effort by using a configuration file.

    Read the article

  • Ubuntu does not recognize install files (easytether)

    - by Esbilick
    I am running Ubuntu 12.04 64-bit (AMD64). I can't get EasyTether to install regardless of which version I use (32- or 64-bit). I keep getting an error when I try to install it from the command line: it tells me it's not a valid deb file. If I click it on the desktop, the install manager comes up, but the install button is not highlighted, so I can't install it that way either. Please help. I'll be using my phone as an internet modem.

    Read the article

  • Other than using `split`, is there a way around the Apache 2.0 maximum file size limit of 2GB?

    - by warren
    I have some ISOs that need to be available across a WAN, so we are using an HTTP server to host them (this allows non-authenticated, read-only access to the data store, beyond being on the VPN). The server the ISOs reside on is running CentOS 4 and Apache 2.0.58. Is there a way around the 2 GB file size limit with Apache 2.0 without using the split utility to chunk the ISOs down to less than 2 GB?

    Read the article

  • What is the best tool to aggregate traffic stats from multiple nginx servers?

    - by gekkz
    The setup: two or more nginx machines; each machine has the same virtual hosts; traffic is load balanced via DNS across the machines. I need to figure out the best tools to use to get some traffic stats; I'm mostly interested in the number of hits and the total traffic in gigabytes. Obviously, the log information will come from nginx, formatted like this: log_format main '$remote_addr $host $remote_user [$time_local] "$request" ' '$status $body_bytes_sent "$http_referer" "$http_user_agent" "$gzip_ratio"';
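    A minimal aggregation sketch (assuming the access logs from each machine have been pulled to one place; the file names below are hypothetical): it parses the log_format above and sums hits and body bytes per virtual host.

    import re
    from collections import defaultdict

    # Matches the "main" log_format shown above:
    # $remote_addr $host $remote_user [$time_local] "$request" $status $body_bytes_sent ...
    LINE = re.compile(
        r'^(?P<addr>\S+) (?P<host>\S+) \S+ \[[^\]]+\] "[^"]*" '
        r'(?P<status>\d{3}) (?P<bytes>\d+|-)'
    )

    def aggregate(paths):
        """Sum hits and body bytes per virtual host across several access logs."""
        stats = defaultdict(lambda: {"hits": 0, "bytes": 0})
        for path in paths:
            with open(path) as log:
                for line in log:
                    m = LINE.match(line)
                    if not m:
                        continue
                    host = m.group("host")
                    stats[host]["hits"] += 1
                    if m.group("bytes") != "-":
                        stats[host]["bytes"] += int(m.group("bytes"))
        return stats

    if __name__ == "__main__":
        # Hypothetical file names: one log pulled from each nginx machine.
        for host, s in aggregate(["access.node1.log", "access.node2.log"]).items():
            print(host, s["hits"], "hits", round(s["bytes"] / 1e9, 2), "GB")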

    Read the article

  • most simple way to get files on a server

    - by acidzombie24
    I am on Windows and my server is Linux. I would like to grab files from the server automatically with a script, and maybe execute a bash script remotely as well, though maybe I don't need that. I need to connect securely, and I would like some kind of password so that not just anyone can connect. I need to download files, and I'd like to get every file in a set of folders, but I do not want to download them again if they already exist locally. What is the easiest way to do this? I thought of creating a simple .NET site with the data in App_Data (so it can't be reached from the outside), but I have a feeling an easier way exists. I'd like to do scp from a shell, but I am on Windows, and I am also unsure how to iterate through folders and only get the files that don't exist.
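    A sketch of one possible approach (assuming Python with the third-party paramiko library installed on the Windows side; the host, credentials, and paths below are placeholders): connect over SFTP with a password, walk the remote folders, and download only the files that don't already exist locally.

    import os
    import stat
    import paramiko  # third-party SSH/SFTP library (pip install paramiko)

    def fetch_missing(sftp, remote_dir, local_dir):
        """Download every remote file that does not already exist locally."""
        os.makedirs(local_dir, exist_ok=True)
        for entry in sftp.listdir_attr(remote_dir):
            remote_path = remote_dir + "/" + entry.filename
            local_path = os.path.join(local_dir, entry.filename)
            if stat.S_ISDIR(entry.st_mode):
                fetch_missing(sftp, remote_path, local_path)  # recurse into subfolders
            elif not os.path.exists(local_path):
                sftp.get(remote_path, local_path)

    if __name__ == "__main__":
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect("example.com", username="me", password="secret")  # placeholder credentials
        try:
            fetch_missing(client.open_sftp(), "/var/data", r"C:\data")
        finally:
            client.close()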

    Read the article

  • Find copies of folders? (Not files)

    - by acidzombie24
    I have a dozen folders that are duplicates. Within them are a few dozen folders that are also duplicates, so I have a few thousand copies of the same files and folders. Many of them are exactly the same, while others have changes in a few files. What utility can I use to delete folders that are copies of others with no changes? If one or more files in a folder have been changed, I don't want that folder deleted (and I'd like the deleted subfolders to be replaced with shortcuts to a surviving copy, but that's not required). Is there a utility to do this? (A sketch of the underlying idea follows.)
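    For the "exact copies only" requirement, the key idea is to compare folders by a signature built from every contained file's relative path and contents: folders with identical signatures are byte-for-byte copies, while a folder with even one changed file gets a different signature and is left alone. A minimal sketch (Python, with hypothetical folder names):

    import hashlib
    import os

    def folder_signature(root):
        """Hash every file's relative path and contents; identical folders yield identical signatures."""
        digest = hashlib.sha256()
        for dirpath, dirnames, filenames in os.walk(root):
            dirnames.sort()  # deterministic traversal order
            for name in sorted(filenames):
                path = os.path.join(dirpath, name)
                digest.update(os.path.relpath(path, root).encode())
                with open(path, "rb") as f:
                    for chunk in iter(lambda: f.read(1 << 20), b""):
                        digest.update(chunk)
        return digest.hexdigest()

    def duplicate_folder_groups(candidates):
        """Group candidate folders by signature; any group larger than one is a set of exact copies."""
        groups = {}
        for folder in candidates:
            groups.setdefault(folder_signature(folder), []).append(folder)
        return [group for group in groups.values() if len(group) > 1]

    print(duplicate_folder_groups([r"C:\backup1", r"C:\backup2", r"C:\backup3"]))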

    Read the article

  • Getting ANT to scp only new/changed files

    - by Artem
    I would like to optimize my scp deployment, which currently copies all files, so that it only copies files that have changed since the last build. I believe it should be possible with my current setup somehow, but I don't know how to do it. I have the following:
    Project/src/blah/blah/ <---- files I am editing (mostly PHP in this case, some static assets)
    Project/build <------- a local build step copies the files to here
    I have an scp task right now that copies all of Project/build out to a remote server when I need it. Is it possible to take advantage of this extra "build" directory to accomplish what I want, meaning I only upload the "diff" between src/** and build/**? Is it possible to retrieve this as a fileset in ANT and then scp that? I realize this means that if I delete or mess around with files on the server in between, the ANT script would not notice, but that's okay for me.

    Read the article

  • nmake makefile, linking object files in a subfolder

    - by Gauthier
    My makefile defines a link command: prod_link = $(LINK) $(LINK_FLAGS) -o$(PROD_OUT) $(PROD_OBJS) where $(PROD_OBJS) is a list of object files of the form: PROD_OBJS = objfile1.obj objfile2.obj objfile3.obj ... objfileN.obj The makefile itself is at the root of my project directory. It gets messy to have object and listing files at the root, so I'd like to put them in a subfolder. Building and outputting the obj files to a subfolder works; I'm doing it with suffixes and inference: .s.obj: $(ASSEMBLY) $(FLAGS) $*.s -o Objects\$*.obj The problem is passing the Objects folder to the link command. I tried: prod_link = $(LINK) $(LINK_FLAGS) -o$(PROD_OUT) Objects\$(PROD_OBJS) but only the first file in the list of object files gets the folder name. How can I prepend the Objects subfolder to every file in my list $(PROD_OBJS)? EDIT: I also tried PROD_OBJS = $(patsubst %.ss,Object\%.obj, $(PROD_SRC)) but got: makefile(51) : fatal error U1000: syntax error : ')' missing in macro invocation Stop. This is quite strange...

    Read the article

  • create a wav file from multiple wav files in delphi

    - by Bayu
    I have a problem with my final project: I'm having trouble saving multiple WAV files into one WAV file. Let's take an example: I have 3 WAV files which are the syllables of the word "hospital": "hos.wav", "pi.wav", and "tal.wav" (sorry if I'm wrong in determining the syllables of the word). Each of those files contains an utterance of one syllable of "hospital". My task is to merge those files so that the word "hospital" can be reproduced from them, and then to save the merged result as a new WAV file, let's say "hospital.wav". I've done the first part, but not the second. Can anyone help me? Thanks in advance.
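    The merging logic itself is language-agnostic: take the format parameters from the first file, append the raw audio frames of every file in order, and write the result out as one new WAV. This only works if all inputs share the same sample rate, channel count, and sample width. A short illustration using Python's standard wave module (not Delphi, just to show the idea; file names taken from the example above):

    import wave

    def concat_wavs(inputs, output):
        """Append the audio frames of several same-format WAV files into one output file."""
        with wave.open(inputs[0], "rb") as first:
            params = first.getparams()  # channels, sample width, frame rate, ...
        with wave.open(output, "wb") as out:
            out.setparams(params)
            for path in inputs:
                with wave.open(path, "rb") as w:
                    out.writeframes(w.readframes(w.getnframes()))

    concat_wavs(["hos.wav", "pi.wav", "tal.wav"], "hospital.wav")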

    Read the article

  • Large file upload into WSS v3

    - by Rubens Farias
    I'd built a WSSv3 application which uploads files in small chunks; when each piece of data arrives, I temporarily keep it in a SQL 2005 image data type field for performance reasons**. The problem comes when the upload ends: I need to move the data from SQL Server into a SharePoint document library through the WSSv3 object model. Right now, I can think of two approaches: SPFileCollection.Add(string, (byte[])reader[0]); // OutOfMemoryException and SPFile file = folder.Files.Add("filename", new byte[]{ }); using(Stream stream = file.OpenBinaryStream()) { // ... init vars and stuff ... while ((bytes = reader.GetBytes(0, offset, buffer, 0, BUFFER_SIZE)) > 0) { stream.Write(buffer, 0, (int)bytes); // Timeout issues } file.SaveBinary(stream); } Is there any other way to complete this task successfully? ** Performance reasons: if you try to write every chunk directly to SharePoint, you'll notice a performance degradation as the file grows (around 100 MB).

    Read the article

  • Which c# project files should I version control?

    - by DTown
    I have a project I'm looking to manage manually via Perforce version control, as I only have the Express edition. What I'm looking for is which files should be excluded from version control, since locking many of the files can cause problems for Visual Studio compiling and debugging. What I have included so far:
    .cs files (except the Properties folder)
    .resx files
    .csproj files
    Excluded:
    bin folder
    obj folder
    Properties folder
    .user file
    Let me know if there is something I have excluded that should be included, or if there is a better way to do this.

    Read the article

  • How to organize a large number of objects

    - by shane
    We have a large number of documents and metadata (XML files) associated with these documents. What is the best way to organize them? Currently we put them into a series of nested folders: /repository/category/date (when they were loaded into our db)/document_number.pdf and .xml. We use the path as a unique identifier for the document in our system. This is more versatile than putting them all in a single flat folder, and it is also independent of our database/application, so we can reload them in case of failure. Yet it introduces some limitations: for example, we can't move the files once they've been placed in this structure, and it takes work to file them this way. What is the best practice? How do websites such as Scribd deal with this problem?

    Read the article

  • Having py2exe include my data files (like include_package_data)

    - by cool-RR
    I have a Python app which includes non-Python data files in some of its subpackages. I've been using the include_package_data option in my setup.py to include all these files automatically when making distributions. It works well. Now I'm starting to use py2exe. I expected it to see that I have include_package_data=True and to include all the files. But it doesn't. It puts only my Python files in the library.zip, so my app doesn't work. How do I make py2exe include my data files?
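    Since py2exe builds its file list independently and doesn't appear to honor include_package_data (as described above), one common workaround is to list the data files explicitly via data_files, which py2exe does copy into the dist directory next to the executable. A sketch, with a hypothetical mypackage/data directory and entry script:

    import os
    from distutils.core import setup
    import py2exe  # importing registers the py2exe command

    def collect_data_files(source_root):
        """Build a data_files list that mirrors source_root into the dist directory."""
        data_files = []
        for dirpath, _dirnames, filenames in os.walk(source_root):
            if filenames:
                data_files.append(
                    (dirpath, [os.path.join(dirpath, name) for name in filenames])
                )
        return data_files

    setup(
        console=["myapp.py"],  # hypothetical entry script
        data_files=collect_data_files(os.path.join("mypackage", "data")),
    )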

    Read the article

  • svcmap, disco, xsd, wsdl, svcinfo and datasource files

    - by David Gray Wright
    We have a web service named, let's say, Foo, so there is a Foo.svc file and a code-behind Foo.svc.cs. We add a Silverlight project and wish to use the Foo.svc services, so we add a Service Reference and call its namespace FooBar. This creates the following files: Reference.cs, Reference.svcmap, Foo.xsd, Foo.disco, configuration.svcinfo, Foo.wsdl, and various *.datasource files. Over time we update Foo.svc and add more web services (methods and interfaces), and the number of files in the FooBar directory keeps growing. I have 26 Foo(nn).xsd files in this directory, where nn = 1 to 26, and my configuration.svcinfo is up to configuration91.svcinfo. My question is this: do any of these files need to be version controlled? Can they all be deleted each time you do a build/deploy (as long as you do an Update Service Reference)?

    Read the article

  • Viewing large text file in a browser

    - by MeLight
    Hi, I need to write a text file viewer (not the directory tree, but the actual file contents) for use in a browser. It will be used to view large files. I want to give the user the ability to actually browse the file, i.e. previous page and next page buttons, while each page shows only a portion of the file. Two questions: Is there any way to pass the file descriptor through POST (or something) so that on each page I can keep reading from an already open file and not start all over again (again, huge files)? Is there a way to read the file backwards? That would be very useful for browsing back in a file. Any other implementation ideas are very welcome. Thanks.
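    On the first question: an open file descriptor can't survive between HTTP requests, so the usual workaround is to send the byte offset of the next page to the client and, on the following request, reopen the file and seek straight to that offset (seeking is cheap even in huge files). "Reading backwards" then just means seeking to an earlier offset. A minimal server-side sketch (Python here purely for illustration; the original question doesn't name a language):

    def read_page(path, offset, page_size=64 * 1024):
        """Return one page of text plus the offsets to embed in the prev/next links."""
        with open(path, "rb") as f:
            f.seek(offset)  # jump straight to the requested position
            chunk = f.read(page_size)
        prev_offset = max(0, offset - page_size)
        next_offset = offset + len(chunk)
        return chunk.decode("utf-8", errors="replace"), prev_offset, next_offset

    # Each request carries the offset (query string or POST field); no handle stays open.
    text, prev_off, next_off = read_page("huge.log", offset=0)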

    Read the article

  • Flash files are playing laggy / jerky in screensaver

    - by yogesh-chhabra
    I have developed a screen saver for Mac OS X that plays Flash files. When I run the screen saver at a display resolution higher than 1300 x 1000, the Flash files play very jerky / laggy. When I run it at a low resolution, the files play fine, and the same Flash files play fine in a desktop application. The problem is worse on 10.6 Snow Leopard. I am unable to work out whether it is a problem with the OS screensaver engine or with the Flash files. I know the rendering load increases at high resolutions, but there should be a solution to this problem.

    Read the article

  • Preventing referenced assembly PDB and XML files copied to output

    - by Jason Morse
    I have a Visual Studio 2008 C#/.NET 3.5 project with a post-build task to ZIP the contents. However, I'm finding that the referenced assemblies' .pdb (debug) and .xml (documentation) files also end up in my output directory (and ZIP). For example, if MyProject.csproj references YourAssembly.dll and there are YourAssembly.xml and YourAssembly.pdb files in the same directory as the DLL, they will show up in my output directory (and ZIP). I can exclude *.pdb when zipping, but I cannot blanket-exclude the *.xml files, as I have deployment files with the same extension. Is there a way to prevent the project from copying referenced assemblies' PDB and XML files?

    Read the article

  • Drupal Filefield won't upload javascript files?

    - by hfidgen
    Hiya, I've got a site where individual pages might require some JavaScript or CSS files hooked into their heads. I'm trying to keep everything client-side when it comes to managing this process, rather than getting on the FTP and sorting everything out in the code, so I need to be able to upload CSS and JS files. I've got CCK FileField up and running, and it works with CSS files, but it refuses to upload .js files. Instead it seems to treat every .js as ".js.txt", and the file then appears on the server as thisismyfile.js.txt. Not ideal... Does anyone know how to work around this? Is it a MIME type problem with Drupal or the server, or is Drupal set up to avoid script uploads and n00b hack attacks? Once the files are uploaded I intend to use PHP mode on the page or node to call drupal_add_css and drupal_add_js.

    Read the article

  • Dynamically add files to visual studio deployment project.

    - by Graeme Yeo
    I've been desperately looking for the answer to this and I feel I'm missing something obvious. I need to copy a folder full of data files into the TARGETDIR of my deployment project at compile time. I can see how I would add individual files (i.e. right-click in File System and go to Add > File), but I have a folder full of data files which constantly gets added to, and I'd prefer not to have to add the new files each time I compile. I have tried using a PreBuildEvent to copy the files: copy $(ProjectDir)..\Data*.* $(TargetDir)Data\ which fails with error code 1 when I build. I can't help but feel I'm missing the point here, though. Any suggestions? Thanks in advance. Graeme

    Read the article

  • Load a file from a group in Objective-C / Xcode

    - by okami
    I'd like to load a file from a specific group in Xcode/Objective-C. For example: I have TWO files named "SomeFile.txt" that are in two different folders (folders, not groups yet) on disk:
    SomeFolderOne |- SomeFile.txt
    SomeFolderTwo |- SomeFile.txt
    Inside Xcode I make two groups and put a REFERENCE to these two files in them:
    SomeGroupOne |- SomeFile.txt // a reference to the SomeFile.txt from SomeFolderOne
    SomeGroupTwo |- SomeFile.txt // a reference to the SomeFile.txt from SomeFolderTwo
    Now I want to read the txt content with: NSString *contents = [NSString stringWithContentsOfFile:@"SomeFile.txt" encoding:NSUTF8StringEncoding error:nil]; It reads 'SomeFile.txt', but sometimes the file read is from SomeGroupOne and sometimes from SomeGroupTwo. How can I specify which group I want the file to be read from?

    Read the article
