Search Results

Search found 69034 results on 2762 pages for 'file locking'.

Page 21/2762 | < Previous Page | 17 18 19 20 21 22 23 24 25 26 27 28  | Next Page >

  • Why can't I delete this file on my PC?

    - by ooo
    I am trying to install some software, and it requires removing the following file:

        cd %SYSTEMROOT%\assembly\GAC
        rename Microsoft.Office.Interop.Outlook Microsoft.Office.Interop.Outlook.Old

    The problem is that the rename appears to succeed, but it doesn't actually remove Microsoft.Office.Interop.Outlook. I then tried an explicit delete:

        del Microsoft.Office.Interop.Outlook

    and again it looks like it works, but when I run "dir" I still see that file. Do you have any suggestions on why I can't delete this file? Outlook is closed while I am doing this, so I wouldn't think Outlook is locking the file.
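
    A hedged note: the GAC has its own uninstall path, so one thing that may work better than renaming or deleting the folder directly is removing the assembly with gacutil (this assumes the .NET SDK tool is available on the machine, and that no installed product still references the assembly):

        gacutil /u Microsoft.Office.Interop.Outlook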

    Read the article

  • File Access/Rename Issue

    - by Moon .
    Guys, I am having a serious problem with media files (only video files). When I try to rename or delete a file, or a folder that contains media files, I get an error: "Access Denied ... Some other program might be using this file ...". Previously, for this kind of problem, I had a freeware tool called "Unlocker" that used to work; all Unlocker did was kill the process that was using the file. But in my case the process using the files is Explorer.exe, which is strange. Even stranger: when I copy and paste the same file, I can rename or delete the copy without any trouble, but not the original. Some details to give you a better idea of my situation:

    - I am using Windows XP SP2.
    - I have Div Player installed (latest version).
    - The media files come both from my hard drive (created under several previous Windows installations) and from new downloads, so file ownership or permissions should not be the issue.
    - I may fail to rename or delete a file one moment and succeed on the very next try.

    I can't figure it out. Please help; it's getting on my nerves.

    Read the article

  • Recommend web file-sharing software, please

    - by Baczek
    I'm looking for a web platform to put company files on. My requirements are:

    - accessible via a browser
    - open source
    - installable on our own server (Dropbox is a no-go)
    - can put an access time limit on a file
    - performs garbage collection automatically after a file expires
    - can mark files as public or private
    - an option to protect a file with a PIN code, for users without accounts in the system, would be nice to have

    The problem is I don't even know what to search for: all my googling turns up either complete groupware solutions or P2P file-sharing software. If such a thing doesn't exist, please don't hesitate to say so, so I can crawl into a corner and cry myself to sleep. TIA

    Read the article

  • Read a file in a shell script

    - by moata_u
    How can I read a file in a shell script and then assign each line to a variable that I can use later? (I am thinking of a way to load default settings from a file.) I have already tried:

        process () {
        }

        FILE=''
        read -p "Please enter name of default file : " FILE
        if [ ! -f $FILE ]; then
            echo "$FILE : does not exist"
            exit 1
        elif [ ! -r $FILE ]; then
            echo "$FILE : cannot be read"
        fi
        exec 0<"$FILE"
        n=0
        while read -r line
        do
            # (assign each line to a variable)
        done
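
    A minimal sketch of one way to do this, assuming bash (the array name and settings file name are just placeholders):

        #!/bin/bash
        # Read every line of a settings file into an indexed array for later use.
        FILE=defaults.conf            # hypothetical settings file
        lines=()
        while IFS= read -r line; do
            lines+=("$line")          # keep each line; use ${lines[0]}, ${lines[1]}, ... later
        done < "$FILE"

        echo "first setting: ${lines[0]}"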

    Read the article

  • Can't delete file from Windows 7

    - by r.s.mahanti
    I downloaded a torrent file from the internet. The problem is that its size shows as 0 bytes. I tried to scan it with my antivirus, upload it to VirusTotal, and delete it, but each time I'm told the file is not found. I tried to delete it in safe mode as well, with no success. Can anybody explain what could cause this, and how I can delete the file? Thanks in advance. My operating system is Windows 7. EDIT: the name of the file is "[Torrentreactor.to] - Site Translator 4.06.torrent."

    Read the article

  • How can I reorder parts of a video file

    - by sandeep
    I have downloaded an MKV movie that came as three files suffixed .001, .002, and .003. When I join them with different tools (WinRAR, 7-Zip, HJSplit), the concatenated file shows only the last 40 minutes of the movie's total length (about 1.22 hours). If I play the parts (.001, .002, .003) individually in VLC, I can see that .003 is actually the first part of the video and .001 is the last part. Can anyone tell me how I can join these parts of the movie in the correct order, or how I can turn the .003 file into the .001 file?
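
    A hedged sketch, assuming the three pieces are a plain binary split (which is how HJSplit-style .001/.002/.003 files normally work): on Windows the parts can be concatenated in whatever order you choose with a binary copy, so listing .003 first puts it at the front of the output. The file names below are placeholders; substitute the real part names, keeping the + order as the order you want the pieces to appear in:

        copy /b "movie.mkv.003"+"movie.mkv.002"+"movie.mkv.001" "joined.mkv"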

    Read the article

  • Slow network file copy on Windows 7

    - by Jason V.
    I wrote a program that uses xcopy to transfer files (usually between 1 KB and 2 MB) over our intranet. Usually I am copying files from my host machine (Windows 7 x64) to a VMware virtual machine running Windows Server 2008 (the VM is running on my host machine, if that matters). On Windows XP, the file transfers usually take only a few seconds to complete, but on my Windows 7 machine the transfer of the first file (1.5 MB) takes around 1.5 minutes. This is true whether I use xcopy, robocopy, or copy programmatically with File.Copy(). I noticed that with File.Copy the first transfer is very slow and subsequent transfers are much faster. Any clue how I can speed up the process? Is there a setting in Windows 7 (or Server 2008) that I could try?

    Read the article

  • Windows OS level file open event trigger

    - by john
    Colleagues, I need to run a script/program on certain basic OS-level events, in particular when a file in Windows is opened. The open may be read-only or for editing, and may be initiated in a number of ways: from Windows Explorer (open or ...), by being selected from a viewing or editing application's native file chooser, or by drag-and-drop into an editing or viewing application. Further, I need the trigger to hold the event and prevent the requested action from completing until my program has finished running. The event-handler program may return a pass state or a fail state; if it returns a fail state, the event must disallow the initially requested action. Lastly, I need to attach to the file in question a property or attribute containing metadata that the event-handler program will use to determine the pass/fail condition, which ultimately decides whether the user is permitted to open the file. Please note that this is NOT a Windows event log situation, but an OS-level file-open event. Thanks very much for your help. Regards, j

    Read the article

  • Rebuilding a file if files have changed

    - by Todd Strauch
    I'm having a hard time wrapping my head around this. It seems trivial, but I'm not getting it. I have two files, and if either of them changes, I want to rebuild one of them. Essentially:

        if file a changes or file b changes
        then
            file { "a":
                content => template('a.erb', 'b.erb'),
            }

    I know I can audit a file for changes; I just don't know how to include that within a conditional. Can anyone point me in the right direction?

    Read the article

  • C file read leaves garbage characters

    - by KJ
    Hi. I'm trying to read the contents of a file into my program, but I keep occasionally getting garbage characters at the end of the buffers. I haven't been using C a lot (rather, I've been using C++), but I assume it has something to do with streams. I don't really know what to do, though. I'm using MinGW. Here is the code (this gives me garbage at the end of the second read):

        #include <stdio.h>
        #include <stdlib.h>

        char* filetobuf(char *file)
        {
            FILE *fptr;
            long length;
            char *buf;

            fptr = fopen(file, "r");       /* Open file for reading */
            if (!fptr)                     /* Return NULL on failure */
                return NULL;
            fseek(fptr, 0, SEEK_END);      /* Seek to the end of the file */
            length = ftell(fptr);          /* Find out how many bytes into the file we are */
            buf = (char*)malloc(length+1); /* Allocate a buffer for the entire length of the file and a null terminator */
            fseek(fptr, 0, SEEK_SET);      /* Go back to the beginning of the file */
            fread(buf, length, 1, fptr);   /* Read the contents of the file into the buffer */
            fclose(fptr);                  /* Close the file */
            buf[length] = 0;               /* Null terminator */
            return buf;                    /* Return the buffer */
        }

        int main()
        {
            char* vs;
            char* fs;
            vs = filetobuf("testshader.vs");
            fs = filetobuf("testshader.fs");
            printf("%s\n\n\n%s", vs, fs);
            free(vs);
            free(fs);
            return 0;
        }

    The filetobuf function is from this example: http://www.opengl.org/wiki/Tutorial2:_VAOs,_VBOs,_Vertex_and_Fragment_Shaders_%28C_/_SDL%29. It seems right to me, though. So anyway, what's up with that?
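
    A hedged observation, sketched as a variant: when a file is opened in text mode ("r") on Windows, ftell can report more bytes than fread actually delivers (CR/LF pairs are collapsed), so terminating at buf[length] can leave uninitialized bytes just before the terminator. Opening in binary mode and terminating at the count fread returns avoids that; the function name below is just a placeholder:

        #include <stdio.h>
        #include <stdlib.h>

        /* Read a whole file into a NUL-terminated buffer; the caller frees it. */
        char *filetobuf_binary(const char *path)
        {
            FILE *fp = fopen(path, "rb");   /* binary mode: no newline translation */
            if (!fp)
                return NULL;
            fseek(fp, 0, SEEK_END);
            long length = ftell(fp);
            fseek(fp, 0, SEEK_SET);
            char *buf = malloc((size_t)length + 1);
            if (!buf) {
                fclose(fp);
                return NULL;
            }
            size_t got = fread(buf, 1, (size_t)length, fp); /* bytes actually read */
            buf[got] = '\0';                /* terminate at the real end of the data */
            fclose(fp);
            return buf;
        }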

    Read the article

  • How do I find out which process ID and thread ID/name has a file open

    - by peter
    Hi all, I am using C# in an application and am having some problems with a file becoming locked. The piece of code does this:

        while (true)
        {
            // Read a packet from a socket (with data in it to add to the file)
            // Open a file
            // Write data to it
            // Close the file
        }

    But somewhere in the process the file becomes locked. I don't really understand how; we are definitely catching and reporting exceptions, so I don't see how the file doesn't get closed every time. My best guess is that something else is opening the file, but I want to prove it. Can someone please provide a piece of code to check whether the file is open and, if so, report what process ID and thread ID has the file open? For example, if I had this code:

        StreamWriter streamWriter1 = new StreamWriter(@"c:\logs\test.txt");
        streamWriter1.WriteLine("Test");
        // code to check for locks??
        StreamWriter streamWriter2 = new StreamWriter(@"c:\logs\test.txt");
        streamWriter1.Close();
        streamWriter2.Close();

    That will throw an exception because the file is locked when we try to open it the second time. So, where the comment is, what could I put in there to report that the current app (process ID) and the current thread (thread ID) have the file locked? Thanks.
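
    Finding the owning process and thread generally needs OS-level help (for example the Sysinternals Handle utility or the Windows Restart Manager API); from plain .NET the usual sketch is just to detect that some handle still has the file open by trying to take it exclusively. A minimal, hedged example of that detection (the class and method names are placeholders):

        using System;
        using System.IO;

        static class FileLockProbe
        {
            // Returns true if the file cannot currently be opened for exclusive
            // access, i.e. some handle (in this process or another) still has it open.
            public static bool IsLocked(string path)
            {
                try
                {
                    using (var fs = new FileStream(path, FileMode.Open,
                                                   FileAccess.ReadWrite, FileShare.None))
                    {
                        return false;
                    }
                }
                catch (IOException)
                {
                    return true;
                }
            }
        }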

    Read the article

  • MVC 4 Beta with Mobile Project: File Upload does not work

    - by Jim Shaffer
    I am playing around with the new MVC 4 beta release. I created a new web project using the Mobile Application template. I simply added a controller and a view to upload a file, but the file is always null in the action result. Is this a bug, or am I doing something wrong? Controller code:

        using System.IO;
        using System.Web;
        using System.Web.Mvc;

        namespace MobileWebExample.Controllers
        {
            public class FileUploadController : Controller
            {
                public ActionResult Index()
                {
                    return View();
                }

                [AllowAnonymous]
                [HttpPost]
                [AcceptVerbs(HttpVerbs.Post)]
                public ActionResult Upload(HttpPostedFileBase file)
                {
                    int i = Request.Files.Count;
                    if (file != null)
                    {
                        if (file.ContentLength > 0)
                        {
                            var fileName = Path.GetFileName(file.FileName);
                            var path = Path.Combine(Server.MapPath("~/App_Data/uploads"), fileName);
                            file.SaveAs(path);
                        }
                    }
                    return RedirectToAction("Index");
                }
            }
        }

    And the view looks like this:

        @{
            ViewBag.Title = "Index";
        }

        <h2>Index</h2>

        <form action="@Url.Action("Upload")" method="post" enctype="multipart/form-data">
            <label for="file">Filename:</label>
            <input type="file" name="file" id="file" />
            <input type="submit" value="Submit" />
        </form>
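
    A hedged guess worth checking: the Mobile Application template wires up jQuery Mobile, which submits forms via Ajax by default and can drop the multipart file data. Opting this form out of Ajax handling is one thing to try (this is an assumption about the cause, not a confirmed fix):

        <form action="@Url.Action("Upload")" method="post"
              enctype="multipart/form-data" data-ajax="false">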

    Read the article

  • Large file upload problem with PHP

    - by chris
    I need to upload a CSV file to a server. It works fine for smaller files, but when the file is 3-6 MB it doesn't work.

        $allowedExtensions = array("csv");
        foreach ($_FILES as $file) {
            if ($file['tmp_name'] > '') {
                if (!in_array(end(explode(".", strtolower($file['name']))), $allowedExtensions)) {
                    die($file['name'].' is an invalid file type!<br/>'.
                        '<a href="javascript:history.go(-1);">'.
                        '&lt;&lt; Go Back</a>');
                }
                if (move_uploaded_file($file['tmp_name'], $uploadfile)) {
                    echo "File is valid, and was successfully uploaded.\n";
                } else {
                    echo "Possible file upload attack!\n";
                }
                echo "File has been uploaded";
            }
        }

    The upload form:

        <form name="upload" enctype="multipart/form-data" action="<? echo $_SERVER['PHP_SELF'];?>?action=upload_process" method="POST">
            <!-- MAX_FILE_SIZE must precede the file input field -->
            <input type="hidden" name="MAX_FILE_SIZE" value="31457280" />
            <!-- Name of input element determines name in $_FILES array -->
            Send this file: <input name="userfile" type="file" />
            <input type="submit" value="Send File" />
        </form>

    I have also added this to .htaccess:

        php_value upload_max_filesize 20M
        php_value post_max_size 20M
        php_value max_execution_time 200
        php_value max_input_time 200

    Where am I going wrong?
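
    A small debugging sketch that may help narrow this down: PHP records why an upload failed in $_FILES[...]['error'], so checking that code before calling move_uploaded_file makes size-limit problems explicit (this assumes the field name "userfile" from the form above):

        <?php
        $err = isset($_FILES['userfile']['error'])
             ? $_FILES['userfile']['error']
             : UPLOAD_ERR_NO_FILE;

        switch ($err) {
            case UPLOAD_ERR_OK:
                echo "Upload arrived intact.\n";
                break;
            case UPLOAD_ERR_INI_SIZE:
            case UPLOAD_ERR_FORM_SIZE:
                echo "File exceeds upload_max_filesize / MAX_FILE_SIZE.\n";
                break;
            case UPLOAD_ERR_PARTIAL:
                echo "File was only partially uploaded.\n";
                break;
            default:
                echo "Upload failed with error code $err.\n";
        }
        ?>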

    Read the article

  • Python file iterator over a binary file with newer idiom.

    - by drewk
    In Python, for a binary file, I can write this:

        buf_size = 1024*64  # this is an important size...
        with open(file, "rb") as f:
            while True:
                data = f.read(buf_size)
                if not data:
                    break
                # deal with the data....

    With a text file that I want to read line-by-line, I can write this:

        with open(file, "r") as file:
            for line in file:
                # deal with each line....

    Which is shorthand for:

        with open(file, "r") as file:
            for line in iter(file.readline, ""):
                # deal with each line....

    This idiom is documented in PEP 234, but I have failed to locate a similar idiom for binary files. I have tried this:

        >>> with open('dups.txt','rb') as f:
        ...     for chunk in iter(f.read, ''):
        ...         i += 1
        >>> i
        1   # 30 MB file, i == 1 means it was read in one go...

    I tried putting iter(f.read(buf_size), '') but that is a syntax error because of the parens after the callable in iter(). I know I could write a function, but is there a way, with the default idiom of "for chunk in file:", where I can use a buffer size rather than reading line by line? Thanks for putting up with the Python newbie trying to write his first non-trivial and idiomatic Python script.
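
    A minimal sketch of one idiom that fits the question, assuming Python 3 (where a binary read returns bytes, so the sentinel is b''); functools.partial turns f.read plus a buffer size into the zero-argument callable that iter() expects:

        from functools import partial

        buf_size = 1024 * 64

        with open('dups.txt', 'rb') as f:   # file name taken from the question
            for chunk in iter(partial(f.read, buf_size), b''):
                pass                        # deal with each chunk here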

    Read the article

  • Handling file renames in git

    - by Greg K
    I'd read that when renaming files in git, you should commit any changes, perform your rename, and then stage your renamed file. Git will recognise the file from its contents, rather than seeing it as a new untracked file, and keep the change history. However, doing just this tonight I ended up reverting to git mv.

        > $ git status
        # On branch master
        # Changes to be committed:
        #   (use "git reset HEAD <file>..." to unstage)
        #
        #       modified:   index.html
        #

    Rename my stylesheet in Finder from iphone.css to mobile.css:

        > $ git status
        # On branch master
        # Changes to be committed:
        #   (use "git reset HEAD <file>..." to unstage)
        #
        #       modified:   index.html
        #
        # Changed but not updated:
        #   (use "git add/rm <file>..." to update what will be committed)
        #   (use "git checkout -- <file>..." to discard changes in working directory)
        #
        #       deleted:    css/iphone.css
        #
        # Untracked files:
        #   (use "git add <file>..." to include in what will be committed)
        #
        #       css/mobile.css

    So git now thinks I've deleted one CSS file and added a new one. Not what I want; let's undo the rename and let git do the work.

        > $ git reset HEAD .
        Unstaged changes after reset:
        M       css/iphone.css
        M       index.html

    Back to where I began.

        > $ git status
        # On branch master
        # Changes to be committed:
        #   (use "git reset HEAD <file>..." to unstage)
        #
        #       modified:   index.html
        #

    Let's use git mv instead.

        > $ git mv css/iphone.css css/mobile.css
        > $ git status
        # On branch master
        # Changes to be committed:
        #   (use "git reset HEAD <file>..." to unstage)
        #
        #       renamed:    css/iphone.css -> css/mobile.css
        #
        # Changed but not updated:
        #   (use "git add <file>..." to update what will be committed)
        #   (use "git checkout -- <file>..." to discard changes in working directory)
        #
        #       modified:   index.html
        #

    Looks like we're good. So why didn't git recognise the rename the first time around, when I used Finder?
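
    For what it's worth, a minimal sketch of how the non-git-mv route is usually made to work: git detects the rename only once both the deletion of the old path and the new file are staged together, so the missing step after a Finder rename is staging both sides at once:

        # after renaming css/iphone.css to css/mobile.css in Finder
        git add -A        # stages the deletion and the new file in one go
        git status        # should now report: renamed: css/iphone.css -> css/mobile.css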

    Read the article

  • Unit Testing Hibernate's Optimistic Locking (within Spring)

    - by Michal Bachman
    I'd like to write a unit test to verify that optimistic locking is properly set up (using Spring and Hibernate). I'd like to have the test class extend Spring's AbstractTransactionalJUnit4SpringContextTests. What I want to end up with is a method like this:

        @Test(expected = StaleObjectStateException.class)
        public void testOptimisticLocking() {
            A a = getCurrentSession().load(A.class, 1);
            a.setVersion(a.getVersion() - 1);
            getCurrentSession().saveOrUpdate(a);
            getCurrentSession().flush();
            fail("Optimistic locking does not work");
        }

    This test fails. What do you recommend as a best practice? The reason I am trying to do this is that I want to transfer the version to the client (using a DTO). I want to prove that when the DTO is sent back to the server and merged with a freshly loaded entity, saving that entity will fail if it has been updated by somebody else in the meantime.

    Read the article

  • Ada: Adding an exception in a separate procedure when reading a file

    - by yCalleecharan
    This is a follow-up to this question: Ada: reading from a file. I would like to add an exception handler that checks whether the file that I'm opening actually exists or not. I have made a separate procedure to avoid code clutter. Here is the main code, test_read.adb:

        with Ada.Text_IO; use Ada.Text_IO;
        with Ada.Long_Float_Text_IO;
        with Ada.Float_Text_IO;

        procedure Test_Read is
           Input_File : File_Type;
           Value      : Long_Float;

           procedure Open_Data (File : in Ada.Text_IO.File_Type;
                                Name : in String) is separate;
        begin
           Ada.Text_IO.Open (File => Input_File,
                             Mode => Ada.Text_IO.In_File,
                             Name => "fx.txt");
           while not End_Of_File (Input_File) loop
              Ada.Long_Float_Text_IO.Get (File => Input_File, Item => Value);
              Ada.Long_Float_Text_IO.Put (Item => Value, Fore => 3, Aft => 5, Exp => 0);
              Ada.Text_IO.New_Line;
           end loop;
           Ada.Text_IO.Close (File => Input_File);
        end Test_Read;

    And here is the separate body, test_read-open_data.adb, of the procedure Open_Data:

        separate (Test_Read)
        procedure Open_Data (File : in Ada.Text_IO.File_Type;
                             Name : in String) is
        --  this procedure prepares a file for reading
        begin
           begin
              Ada.Text_IO.Open (File => File,
                                Mode => Ada.Text_IO.In_File,
                                Name => Name);
           exception
              when Ada.Text_IO.Name_Error =>
                 Ada.Text_IO.Put (File => Standard_Error,
                                  Item => "File not found.");
           end;
        end Open_Data;

    On compilation I get an error message in the separate body test_read-open_data.adb:

        actual for "File" must be a variable

    How do I fix this? Thanks a lot.
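
    A hedged sketch of the direction a fix usually takes here: Ada.Text_IO.Open needs to modify its File argument, so the File parameter of Open_Data has to be a variable, i.e. declared with mode "in out" rather than "in" (the declaration in Test_Read and the separate body must both change to match):

        procedure Open_Data (File : in out Ada.Text_IO.File_Type;
                             Name : in String) is separate;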

    Read the article

  • Clojure: Equivalent to Common Lisp READ function?

    - by jkndrkn
    Hi there. When I want to read an S-expression stored in a file into a running Common Lisp program, I do the following:

        (defun load-file (filename)
          "Loads data corresponding to a s-expression in file with name FILENAME."
          (with-open-file (stream filename)
            (read stream)))

    If, for example, I have a file named foo.txt that contains the S-expression (1 2 3), the above function will return that S-expression if called as follows: (load-file "foo.txt"). I've been searching and searching and have not found an equally elegant solution in Clojure. Any ideas? Thanks!
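
    A minimal Clojure sketch of one counterpart, assuming clojure.java.io is available (clojure.core/read wants a PushbackReader, so the file reader is wrapped in one; the function name is just a placeholder):

        (defn load-sexp
          "Reads the first s-expression stored in the file named filename."
          [filename]
          (with-open [r (java.io.PushbackReader. (clojure.java.io/reader filename))]
            (read r)))

        ;; (load-sexp "foo.txt") ;=> (1 2 3)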

    Read the article

  • Why use SyncLocks in .NET for simple operations when Interlocked class is available?

    - by rwmnau
    I've been doing simple multi-threading in VB.NET for a while and have just gotten into my first large multi-threaded project. I've always done everything using the SyncLock statement because I didn't think there was a better way. I just learned about the Interlocked class - it makes it look as though all of this:

        Private SomeInt As Integer
        Private SomeInt_LockObject As New Object

        Public Sub IncrementSomeInt()
            SyncLock SomeInt_LockObject
                SomeInt += 1
            End SyncLock
        End Sub

    can be replaced with a single statement:

        Interlocked.Increment(SomeInt)

    This handles all the locking internally and modifies the number. This would be much simpler than writing my own locks for simple operations (longer-running or more complicated operations obviously still need their own locking). Is there a reason why I'd roll my own locking, using dedicated locking objects, when I can accomplish the same thing using the Interlocked methods?

    Read the article

  • Effective methods for reading and writing large files in C

    - by Bertholt Stutley Johnson
    I'm writing an application that deals with very large user-generated input files. The program will copy about 95 percent of the file, effectively duplicating it while switching a few words and values in the copy, and then append the copy (in chunks) to the original file, such that each block (consisting of between 10 and 50 lines) in the original is followed by the copied and modified block, then the next original block, and so on. The user-generated input conforms to a certain format, and it is highly unlikely that any line in the original file is longer than 100 characters. Which would be the better approach?

    a) Use one file pointer and keep variables that hold the current read position and write position, seeking the file pointer back and forth to read and write; or

    b) Use multiple file pointers, one for reading and one for writing.

    I am mostly concerned with the efficiency of the program, as the input files will reach up to 25,000 lines, each about 50 characters long. Thanks!

    Read the article

  • DataReader Behaviour With SQL Server Locking

    - by Graham
    We are having some issues with our data layer when large datasets are returned from a SQL server query via a DataReader. As we use the DataReader to populate business objects and serialize them back to the client, the fetch can take several minutes (we are showing progress to the user :-)), but we've found that there's some pretty hard-core locking going on on the affected tables which is causing other updates to be blocked. So I guess my slightly naive question is, at what point are the locks which are taken out as a result of executing the query actually relinquished? We seem to be finding that the locks are remaining until the last row of the DataReader has been processed and the DataReader is actually closed - does that seem correct? A quick 101 on how the DataReader works behind the scenes would be great as I've struggled to find any decent information on it. I should say that I realise the locking issues are the main concern but I'm just concerned with the behaviour of the DataReader here.

    Read the article

  • Record locking problem between Linux and Windows

    - by PabloG
    I need to run a bunch of old DOS FoxPro/Clipper applications on Linux under DOSEMU. The programs access their "databases" located on a network server (which could be a Windows or Linux machine). The programs actually run fine, but I cannot get record locking to work as it should: I can run a program in two terminals (or on the server and any terminal, for instance) and lock the same record in both. Right now I'm using Tiny Core Linux as the terminal and Windows XP as the server, accessing the shared files via CIFS with the latest DOSEMU (1.4.0), but I have tried various combinations of server (Ubuntu 7 through 9, Damn Small Linux, XP) <-> protocol (CIFS, Samba, various versions of smbclient) <-> client (same options as the server) with no luck. I tried configuring the server side to work without oplocks, both in Samba (after reading the entire locking chapter of the O'Reilly Samba book at http://oreilly.com/catalog/samba/chapter/book/ch05_05.html) and in XP (\HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters\UseOpportunisticLocking = 0), but the problem persists. Any ideas? TIA, Pablo

    Read the article

  • Do MySQL Locked Tables affect related Views?

    - by CogitoErgoSum
    After reading http://stackoverflow.com/questions/1415602/performance-in-pdo-php-mysql-transaction-versus-direct-execution with regard to performance issues I was thinking about, I did some research on locking tables in MySQL. From http://dev.mysql.com/doc/refman/5.0/en/table-locking.html:

        Table locking enables many sessions to read from a table at the same time, but if a
        session wants to write to a table, it must first get exclusive access. During the
        update, all other sessions that want to access this particular table must wait until
        the update is done.

    This part struck me particularly because most of our queries will be updates rather than inserts. I was wondering: if one created a table called foo, on which all updates/inserts were carried out, and then a view called foo_view (a copy of foo, or perhaps foo joined with several other tables) on which all selects occurred, would this locking issue still apply? That is, would SELECT queries on foo_view still have to wait for an update to finish on foo?
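
    For concreteness, a hedged sketch of the setup being described (the table and view names come from the question; the MyISAM engine is an assumption, since that is where MySQL's table-level locking applies):

        -- Base table receiving all updates/inserts
        CREATE TABLE foo (
            id  INT PRIMARY KEY,
            val VARCHAR(255)
        ) ENGINE=MyISAM;

        -- View used for the selects; it is a stored query over foo,
        -- not a separate copy of the data, so reading it reads foo.
        CREATE VIEW foo_view AS
            SELECT id, val FROM foo;

        -- Session 1: an UPDATE takes a write lock on foo.
        UPDATE foo SET val = 'x' WHERE id > 0;

        -- Session 2: this SELECT is resolved against the base table foo.
        SELECT * FROM foo_view;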

    Read the article
