Search Results

Search found 89530 results on 3582 pages for 'file read'.


  • Intermittent bug - IE6 showing file as text in browser, rather than as file download

    - by Richard Ev
    In an ASP.NET WebForms 2.0 site we are encountering an intermittent bug in IE6 whereby a file download attempt results in the contents of the file being shown directly in the browser as text, rather than the file save dialog being displayed. Our application allows the user to download both PDF and CSV files. The code we're using is:

        HttpResponse response = HttpContext.Current.Response;
        response.Clear();
        response.AddHeader("Content-Disposition", "attachment;filename=\"theFilename.pdf\"");
        response.ContentType = "application/pdf";
        response.BinaryWrite(MethodThatReturnsFileContents());
        response.End();

    This is called from the code-behind click event handler of a button server control. Where are we going wrong with this approach?

    Edit: Following James' answer to this posting, the code I'm using now looks like this:

        HttpResponse response = HttpContext.Current.Response;
        response.ClearHeaders();
        // Setting cache to NoCache was recommended, but doing so results in a security
        // warning in IE6
        //response.Cache.SetCacheability(HttpCacheability.NoCache);
        response.AppendHeader("Content-Disposition", "attachment; filename=\"theFilename.pdf\"");
        response.ContentType = "application/pdf";
        response.BinaryWrite(MethodThatReturnsFileContents());
        response.Flush();
        response.End();

    However, I don't believe that any of the changes made will fix the issue.
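    For reference, a variant commonly suggested for IE6 download problems (a sketch only, not taken from the linked answers): send an explicit Content-Length and use a Private cache policy instead of NoCache, since a no-cache header is what triggers IE6's download warning over SSL. MethodThatReturnsFileContents() is the same hypothetical helper used in the question.

        byte[] contents = MethodThatReturnsFileContents();
        HttpResponse response = HttpContext.Current.Response;
        response.ClearHeaders();
        // Private allows the browser to cache the download without the IE6 SSL warning
        response.Cache.SetCacheability(HttpCacheability.Private);
        response.ContentType = "application/pdf";
        response.AppendHeader("Content-Disposition", "attachment; filename=\"theFilename.pdf\"");
        // Explicit length helps IE treat the response as a complete file download
        response.AppendHeader("Content-Length", contents.Length.ToString());
        response.BinaryWrite(contents);
        response.End();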

    Read the article

  • file corruption on read/write 2.6.32-22-server (happens across many kernels)

    - by Jonathan
    Hi Guys, I'm having an issue where after the server has been up for a period of time (a week or a few days) the server will start reading corrupt data. For instance, when I run a sha1sum of a file after a fresh boot it remains the same, but after a while I will start to get segfaults, and from then on whenever I read this file I get a different sha1sum. I've checked S.M.A.R.T. with long tests and I've run an extended memtest86+ (12 passes). My lspci output is as follows:

        00:00.0 Host bridge: Advanced Micro Devices [AMD] RS780 Host Bridge
        00:01.0 PCI bridge: Advanced Micro Devices [AMD] RS780 PCI to PCI bridge (int gfx)
        00:06.0 PCI bridge: Advanced Micro Devices [AMD] RS780 PCI to PCI bridge (PCIE port 2)
        00:07.0 PCI bridge: Advanced Micro Devices [AMD] RS780 PCI to PCI bridge (PCIE port 3)
        00:11.0 SATA controller: ATI Technologies Inc SB700/SB800 SATA Controller [AHCI mode]
        00:12.0 USB Controller: ATI Technologies Inc SB700/SB800 USB OHCI0 Controller
        00:12.1 USB Controller: ATI Technologies Inc SB700 USB OHCI1 Controller
        00:12.2 USB Controller: ATI Technologies Inc SB700/SB800 USB EHCI Controller
        00:13.0 USB Controller: ATI Technologies Inc SB700/SB800 USB OHCI0 Controller
        00:13.1 USB Controller: ATI Technologies Inc SB700 USB OHCI1 Controller
        00:13.2 USB Controller: ATI Technologies Inc SB700/SB800 USB EHCI Controller
        00:14.0 SMBus: ATI Technologies Inc SBx00 SMBus Controller (rev 3c)
        00:14.1 IDE interface: ATI Technologies Inc SB700/SB800 IDE Controller
        00:14.3 ISA bridge: ATI Technologies Inc SB700/SB800 LPC host controller
        00:14.4 PCI bridge: ATI Technologies Inc SBx00 PCI to PCI Bridge
        00:14.5 USB Controller: ATI Technologies Inc SB700/SB800 USB OHCI2 Controller
        00:18.0 Host bridge: Advanced Micro Devices [AMD] K10 [Opteron, Athlon64, Sempron] HyperTransport Configuration
        00:18.1 Host bridge: Advanced Micro Devices [AMD] K10 [Opteron, Athlon64, Sempron] Address Map
        00:18.2 Host bridge: Advanced Micro Devices [AMD] K10 [Opteron, Athlon64, Sempron] DRAM Controller
        00:18.3 Host bridge: Advanced Micro Devices [AMD] K10 [Opteron, Athlon64, Sempron] Miscellaneous Control
        00:18.4 Host bridge: Advanced Micro Devices [AMD] K10 [Opteron, Athlon64, Sempron] Link Control
        01:05.0 VGA compatible controller: ATI Technologies Inc Radeon HD 3300 Graphics
        01:05.1 Audio device: ATI Technologies Inc RS780 Azalia controller
        02:00.0 Ethernet controller: Atheros Communications Atheros AR8121/AR8113/AR8114 PCI-E Ethernet Controller (rev b0)
        03:00.0 FireWire (IEEE 1394): VIA Technologies, Inc. Device 3403

    I could really use some help on this; do you have any idea what could cause it? It's really frustrating, as it seems to trigger entirely randomly and will not go away until I reboot. I also use KVM for virtualization as well as MD for software RAID on this server, and the processor is a Phenom II X4 965. I don't believe it's the software RAID, however, as this also affects files hosted on non-RAID partitions.

    Read the article

  • Ubuntu + Unable to Edit .bashrc file because of ReadOnly

    - by Napster
    To remove the issue of "WARNING: Unable to verify SSL certificate for api.heroku.com. To disable SSL verification, run with HEROKU_SSL_VERIFY=disable", I found a few solutions by Googling. One of them is to add HEROKU_SSL_VERIFY=disable to .bashrc. Unfortunately, I am not able to edit that file; it gives the error "'readonly' option is set (add ! to override)". I used !wq in place of :wq, but got no response. Please suggest how I can resolve this issue... Thanks
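    For reference, a sketch of the Vim commands usually suggested for this error, assuming the file was simply opened in read-only mode (for example via view or vim -R) and you actually own it:

        " Clear Vim's read-only flag for this buffer, then write and quit as usual:
        :set noreadonly
        :wq

        " Or force the write in one step:
        :wq!

    If the file is owned by another user, the write will still fail regardless of these commands.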

    Read the article

  • partially downloaded a torrent file and renamed it

    - by user2613789
    I had partially downloaded a torrent file in Ubuntu and unfortunately I renamed it. After some time I resumed the remaining torrent, but it was then downloaded under its default name, so the complete file's information is now split across two different files. Please help me out... is there any way I can merge the whole information back into one file? I can't open either file, which means both files got corrupted or so. The file is in mp4 format. Please help me.

    Read the article

  • Which language is more suitable for heavy file tasks?

    - by All
    I need to write a script (based on basic functions) to process image/audio/video files. The process is mainly filesystem tasks and conversions. The database of files is stored in MySQL. The script is simple but causes heavy load on the system; for example, renaming/converting/copying thousands of files in a run. The script does not read the content of files into memory; it just manages the commands for sub-processes. The main weight is on the communication with the filesystem. The script will be used regularly for new files. My concern is about performance. I am thinking of either a shell script or a compiled language like C. Please advise which programming language is more suitable for this purpose, and why.

    UPDATE: An example is to scan a folder for images, convert them with ImageMagick, move the files to a destination folder, get file info, then update the database. As you can see, the process has little room for optimization, and most languages have similar APIs for popular programs like ImageMagick, MySQL, etc. Thus, it can be written in any language. I just wish to reduce resource usage by speeding up the long loop.

    NOTE: I know that questions comparing languages are not looked upon favourably, but I really had a problem choosing, because the problems only appear in action.
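    To make the loop in the UPDATE concrete, here is a rough shell sketch of that workload; all paths, table and column names are invented for illustration, and the external tools (ImageMagick's convert, the mysql client) do the actual heavy lifting regardless of which language drives them:

        #!/bin/sh
        # Convert each incoming image, record its size in the database, then remove the original.
        for f in /srv/incoming/*.jpg; do
            name=$(basename "$f" .jpg)
            convert "$f" "/srv/processed/$name.png"            # ImageMagick does the conversion
            size=$(stat -c %s "/srv/processed/$name.png")      # gather file info
            mysql mydb -e "UPDATE files SET status='done', bytes=$size WHERE name='$name';"
            rm -- "$f"
        done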

    Read the article

  • Associating File Types with AutoVue Desktop Deployment

    - by [email protected]
    Windows users take for granted that when they double click on a document or design, it will open up in its application automatically. One of the questions I'm commonly asked is "How can I get the same behavior with AutoVue Desktop Deployment?". It's pretty easy, but there are a few tricks to doing it.

    Step 1: Download new jvue_direct.bat and icon

    The first thing you'll need to do is download a slightly modified version of jvue_direct.bat. You can find it here (Document 1075784.1) on Oracle's Support Portal. You also want to download the AV.ico file. This is the icon that will be used for all file types associated with AutoVue. Place both of these files in your <AutoVueInstallDirectory>\bin directory.

    Step 2: Associate File Types With AutoVue

    There are two ways to do this. You can do it through the Windows user interface, or you can set up a batch file to do it.

    Associating File Types Through Windows

    The way most people associate file types to an application is using the Windows user interface: you have probably tried to open a file type that Windows doesn't recognize and seen the "Open with" dialog pop up. Although you can use this dialog to associate a file type with AutoVue, I don't recommend it. I much prefer using a batch file to associate file types with AutoVue.

    Associating File Types Using A Batch File

    There are a few good reasons to associate file types using a batch file instead of using the pop-up dialog method:

    • If you have several file types to associate with AutoVue, it's much easier to use a batch file to do them all at once.
    • Doing it through the Windows user interface requires having files of each type available. Using a batch file doesn't require having the files you're associating.
    • Associating file types through the dialog may work well for one person, but what if you're an administrator doing an enterprise-wide deployment of AutoVue Desktop Deployment for several hundred users? You don't want to do this manually for each user. You can have one simple batch file that's run on each user's PC to set up all the file types.
    • You can easily associate an icon with the file types you're opening with AutoVue.

    To use the batch file method, follow these steps:

    1. Create a file called filetype.bat using a text editor and copy and paste the following into it:

        @assoc .dwg=AVFile
        @assoc .jpg=AVFile
        @assoc .doc=AVFile
        @ftype AVFile="%~dp0jvue_direct.bat" "%%1"
        @reg add HKEY_CLASSES_ROOT\AVFile\DefaultIcon /v "" /f /d "%~dp0AV.ico"

    2. Change the lines starting with @assoc. Each of these lines associates a file extension with AutoVue. You can have as many @assoc lines as you want.
    3. Save this file in your <AutoVueInstallDirectory>\bin directory.
    4. Double click this file, or run it from a command prompt.
    5. Restart Windows to get the icons to show up.

    How Does This Work?

    The first three lines are creating a file type called AVFile. We are associating the extensions .dwg, .jpg, and .doc with this file type. You will want to change these lines when creating your own batch file. For example, to associate Microstation designs, which have the extension .dgn, you should delete the @assoc lines above and add the line:

        @assoc .dgn=AVFile

    The line beginning with @ftype tells Windows that all AVFile type files should be opened using AutoVue Desktop Deployment. The final line associates the AutoVue icon with these file types. You may need to restart Windows to see the new icons.

    Warning: One Size Doesn't Fit All

    When deciding which file types should be associated with AutoVue, remember that there are different types of users using it. Your engineers may be pretty surprised to find that after installing AutoVue, double clicking their .dwg file opens AutoVue instead of AutoCAD. If you have more than one type of AutoVue user, make sure you've considered which file types each user group will and will not want associated with AutoVue. If necessary, create a separate file-association batch file for each user type.

    So that's it. In two simple steps you can double click your favorite designs and have them open automatically in AutoVue Desktop Deployment. I'd love to hear how you are using AutoVue Desktop Deployment. What other deployment tips would you be interested in learning about?

    Read the article

  • Persist changes in C

    - by Mohit Deshpande
    I am developing a database-like application that stores a structure containing:

        struct Dictionary {
            char *key;
            char *value;
            struct Dictionary *next;
        };

    As you can see, I am using a linked list to store information. But the problem begins when the user exits the program: I want the information to be stored somewhere. So I was thinking of storing the linked list in a permanent or temporary file using fopen, then, when the user starts the program, retrieving the linked list. Here is the method that prints the linked list to the console:

        void PrintList() {
            int count = 0;
            struct Dictionary *current;
            current = head;
            if (current == NULL) {
                printf("\nThe list is empty!");
                return;
            }
            printf(" Key \t Value\n");
            printf(" ======== \t ========\n");
            while (current != NULL) {
                count++;
                printf("%d. %s \t %s\n", count, current->key, current->value);
                current = current->next;
            }
        }

    So I am thinking of modifying this method to print the information through fprintf instead of printf, and then the program would just get the information from the file. Could someone help me with how I can read and write to this file? What kind of file should it be, temporary or regular? How should I format the file (I was thinking of just having the key first, then the value, then a newline character)?
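    For illustration only, a minimal sketch of the save/load pair the question describes, writing one "key<TAB>value" line per node; the names SaveList and LoadList are invented, head is the same global used by PrintList, and a tab is used as the separator so values may contain spaces:

        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>

        /* Write every node as "key<TAB>value\n" to a regular file. */
        void SaveList(const char *path) {
            FILE *f = fopen(path, "w");
            if (f == NULL) return;
            for (struct Dictionary *cur = head; cur != NULL; cur = cur->next)
                fprintf(f, "%s\t%s\n", cur->key, cur->value);
            fclose(f);
        }

        /* Rebuild the list by reading the same format back in. */
        void LoadList(const char *path) {
            char key[256], value[256];
            FILE *f = fopen(path, "r");
            if (f == NULL) return;
            while (fscanf(f, "%255[^\t]\t%255[^\n]\n", key, value) == 2) {
                struct Dictionary *node = malloc(sizeof *node);
                node->key = strdup(key);      /* strdup is POSIX; malloc + strcpy works too */
                node->value = strdup(value);
                node->next = head;            /* prepend, so the order is reversed on load */
                head = node;
            }
            fclose(f);
        }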

    Read the article

  • Blackberry Development, java.lang.outofmemoryerror

    - by Nikesh Yadav
    Hi Forum, I am new to BlackBerry development (I am using Eclipse with the BlackBerry plug-in). I am trying to read a text file, which I placed in the "src" folder of my BlackBerry project; this text file just contains the word "Test". When I run the program, I get "UncaughtException: java.lang.OutOfMemoryError". Here is the code I am using, where "speech.txt" is the file I am trying to read, placed in the "src" folder:

        public class SpeechMain extends MainScreen {
            public SpeechMain() {
                try {
                    Class myClass = this.getClass();
                    InputStream is = null;
                    is = myClass.getResourceAsStream("speech.txt");
                    InputStreamReader isr = new InputStreamReader(is);
                    char c;
                    while ((c = (char) isr.read()) != -1) {
                        add(new LabelField("" + c));
                    }
                } catch (IOException e) {
                    // TODO Auto-generated catch block
                    e.printStackTrace();
                    add(new LabelField(e.getMessage()));
                }
            }
        }

    Thanks in advance.
    Thanks, Nikesh
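    As an aside (a sketch, not from the thread): InputStreamReader.read() returns an int, and the cast to char turns the end-of-stream value -1 into '\uffff', so the loop above never sees -1 and keeps adding fields until memory runs out. The usual pattern keeps the result as an int and casts only after the check, e.g. inside the same try block:

        int c;  // keep read()'s int result so the end-of-stream value -1 survives
        while ((c = isr.read()) != -1) {
            add(new LabelField(String.valueOf((char) c)));  // cast to char only after the EOF check
        }
        isr.close();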

    Read the article

  • How can the second application read the file only while the first application is not modifying it?

    - by soField
    I have two applications: the first is a bash script and the second is Java. One of them periodically deletes and recreates a specific file (the first); the other one periodically reads this file and processes it in its own logic (the second). How can we achieve that the second application reads the file only when the first application is not modifying it? My aim is to force the second app to read the file only once its content has been fully written. How can I achieve this goal?
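    One common pattern for this (a sketch, not from the thread): have the writer build the file under a temporary name and rename it into place only when it is complete. On the same filesystem, mv is an atomic rename, so the reader opens either the previous complete file or the new one, never a half-written one. The paths and the generate_report command below are invented for illustration:

        #!/bin/bash
        # Writer side: produce the data under a name the reader never looks at, then rename.
        TMP=/var/data/report.txt.tmp
        OUT=/var/data/report.txt

        generate_report > "$TMP"   # hypothetical command that writes the full content
        mv "$TMP" "$OUT"           # atomic rename on the same filesystem

    The Java side can then simply open report.txt whenever it polls.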

    Read the article

  • SQL SERVER – Attach mdf file without ldf file in Database

    - by pinaldave
    Background Story: One of my friends recently called up and asked me if I had spare time to look at his database and give him performance tuning advice. Because I had some free time to help him out, I said yes. I asked him to send me the details of his database structure and sample data. He said that since his database is in a very early stage and is small at the moment, he would like me to have the complete database. My response was "Sure! In that case, take a backup of the database and send it to me. I will restore it on my computer and play with it." He did send me his database; however, his method made me write this quick note here. Instead of taking a full backup of the database and sending it to me, he sent me only the .mdf (primary database file). In fact, I had asked for a complete backup (I wanted to review file groups, files, as well as a few other details). Upon calling my friend, I found that he was not available, so he had left me with only a .mdf file. As I had some extra time, I decided to check out his database structure and get back to him regarding the full backup whenever I could get in touch with him again.

    Technical Talk: If the database was shut down gracefully and there was no abrupt shutdown (power outages, pulling plugs on machines, machine crashes or any other reasons), it is possible (there's no guarantee) to attach the .mdf file alone to the server. Please note that there can be many more reasons for a database not getting attached or restored. In my case, the database had a clean shutdown and there were no complex issues. I was able to recreate a transaction log file and attach the received .mdf file. There are multiple ways of doing this. I am listing all of them here. Before using any of them, please consult the domain expert in your company or industry. Also, never attempt this on a live/production server without the presence of a disaster recovery expert.

        USE [master]
        GO

        -- Method 1: I use this method
        EXEC sp_attach_single_file_db @dbname='TestDb',
            @physname=N'C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSQL\DATA\TestDb.mdf'
        GO

        -- Method 2:
        CREATE DATABASE TestDb ON
            (FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSQL\DATA\TestDb.mdf')
        FOR ATTACH_REBUILD_LOG
        GO

    Method 2: If one or more log files are missing, they are recreated again. There is one more method, which I am demonstrating here but have not used myself before. According to Books Online, it will work only if there is just one log file missing. If more than one log file is involved, all of them are required to undergo the same procedure.

        -- Method 3:
        CREATE DATABASE TestDb ON
            (FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSQL\DATA\TestDb.mdf')
        FOR ATTACH
        GO

    Please read Books Online in depth and consult DR experts before working on the production server. In my case, the above syntax just worked fine as the database was clean when it was detached. Feel free to write your opinions and experiences, for it will help the IT community to learn more from your suggestions and skills.

    Reference: Pinal Dave (http://blog.SQLAuthority.com)

    Filed under: Pinal Dave, Readers Question, SQL, SQL Authority, SQL Backup and Restore, SQL Data Storage, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Clean file separators in Ruby without File.join

    - by kerry
    I love anything that can be done to clean up source code and make it more readable. So, when I came upon this post, I was pretty excited. This is precisely the kind of thing I love. I have never felt good about 'file separator' strings b/c of their ugliness and verbosity. In Java we have:

        String path = "lib" + File.separator + "etc";

    And in Ruby a popular method is:

        path = File.join("lib", "etc")

    Now, by overloading the '/' operator on a String in Ruby:

        class String
          def /(str_to_join)
            File.join(self, str_to_join)
          end
        end

    We can now write:

        path = 'lib'/'src'/'main'

    Brilliant!

    Read the article

  • Scalable distributed file system for blobs like images and other documents

    - by Pinnacle
    Cassandra & HBase both do not efficiently support storage of blobs like images. Storing directly on HDFS stresses the Namenode because of huge number of files. Facebook uses Haystack for images and attachments storage, but this is not open source. So is Lustre a good choice for distributed blob storage? I have read that Amazon S3 is used by many, but this would cost money and personally, I would not like to rely on third party system. What are other suggestions?

    Read the article

  • File property information (last write time and file size) in Explorer out of date by hours over network

    - by David L Morris
    An application is running on a Windows XP Professional machine, picking up a file from a network share on another Windows machine. It detects that the file has been updated (by date and time, or optionally file size) and reads it for any new data. Most of the time the last write time and file size seem to be up to date. Occasionally, this information stops being updated, even though the file is growing (intermittently during the day) with appended content, so that the last write time and file size remain fixed at some arbitrary moment. This is visible in Explorer, where it shows a fixed last write time on the reading machine. Just opening the file to edit it in Notepad immediately refreshes the file properties, and the other application picks up where it left off. The file location can't be changed, nor can the location of the relevant applications. Any solutions to resolve this problem?

    Read the article

  • Online File Library

    - by janvdl
    I'm looking for a system, preferably PHP and web based, to run on Ubuntu Server. Basically it should be a "file forum", in the sense that users can register and be approved to post files to categories. Users with "read" privileges can then go through the categories and download files. Basically I want something like an FTP system, but it should be as easy to manage users, categories, etc. as in a forum system like vB or phpBB. If it could have a forum look and feel that would also be great, but I don't want any discussions taking place.

    Read the article

  • In Eclipse, how to open a file browser in the directory of the currently edited file

    - by JC
    Hi, I know it's possible in Eclipse to open a file browser from your project's resource browser, but is it possible for files that aren't part of your project? Typically, external includes are not found in your resource browser. If there were an equivalent of $(resource_loc) for the editor, it would work, but I wasn't able to find one. Can anyone help me with this? Thanks! JC

    EDIT: I found StartExplorer, but it's a joke of a plug-in. It is hardcoded to use Windows Explorer or cmd.exe. Also, it still requires you to use the resource browser. Other than that, it can open paths selected in the editor, but they must be full paths.

    Read the article

  • Generic file container for quick read of data

    - by DreamCodeR
    Since there are some major privacy issues with a lot of social networking sites, I am trying to think about alternatives. One is to let the user keep all their information stored in some kind of file container. Now, I haven't found a single type of container that can hold "generic" information; the existing ones are only for audio/video. What I want is a container that can be read by PHP, with some kind of index file that lists the user's pictures in an image/ directory in the container, FOAF files (or some alternative XML file describing the user's information, friends, etc.), and so on. My thought was to let users keep all their information and data stored in a container that can be imported to, exported from and deleted from my server (the prototype social networking site I am trying to create), and then uploaded to another site that might use the same format (not that I think that will ever happen, but the users still keep all their pictures, data, comments, messages, etc.). The only thing I have come up with so far is to create a tar archive with the Archive_Tar library, which extracts and creates tar archives, with an index file describing which files hold the messages (there might be several, so each file won't be so large), which pictures are in the image/ folder, what their names are, what comments they have received, and so on. Maybe also the permissions for viewing each type of content. Does there exist any generic container file format that I can use to keep all this information in one file, with a tree-like index file? Or must I try to create something like this myself?

    Read the article

  • How to mount a HFS partition in Ubuntu as Read/Write?

    - by GiH
    I plugged my external hard drive (which was formatted on my Mac as HFS+ journaled) into my Ubuntu Desktop 9.04 64-bit machine. I am not able to get the drive to mount with write capability; how do I do that? Right now all I'm getting is read access. I tried

        sudo mount -t hfsplus /dev/sdf2 /media/"Portable HD"

    but that still gave me only read access... ideas?
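    For reference, a sketch of the approach usually suggested (an aside, not from the thread): the Linux hfsplus driver mounts journaled HFS+ volumes read-only, so either disable journaling on the volume from Disk Utility on the Mac, or force a read/write mount at your own risk:

        # Force read/write on a journaled HFS+ volume; risky if the journal is not clean
        sudo mount -t hfsplus -o force,rw /dev/sdf2 "/media/Portable HD"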

    Read the article

  • Photoshop CS6 Corrupted File recovery

    - by Ben Franchuk
    Last night I was working on a client application mock-up in Photoshop, but was going to take a break from my work, so I saved the .PSD file on my internal HDD and put my computer into stand-by mode once the file had finished saving. Unfortunately my computer crashed while it was entering stand-by and shut itself down (Photoshop was still open). I did not boot it again to make sure all my files were OK because they had already been saved, but today once I opened the file again it was extremely corrupted and also completely un-editable (screenshot below). So what I'm asking is: is there any way to recover my work, or at least some of it? I have put a good few days' work into this project and would hate to have to restart it. The size of the file is 3070 KB, even though it reads as 712 KB in Photoshop. I don't know if these file sizes are larger or smaller than the original non-corrupted file's size, but considering all the layers in the file I suspect it was larger before it got corrupted. I'm using Windows XP Professional 32-bit SP3. Both my OS and the .PSD file in question are located on the same internal HDD (74.4 GB). I do have an external HDD (1.5 TB), but I primarily only use it for movies, music and TV shows. I don't know if it was plugged in at the time I last edited the document, if that means anything. I have tried many image and PSD recovery tools, but none have returned any results that might help recover my work.

    EDIT: I tried using a photo recovery tool (Odboso PhotoRecovery) that actually seems to recover the corrupted file in question, judging by the size of the file, but I cannot recover it because of the licence fee. Knowing that the file is still likely on my HDD, where might it be located?

    Read the article

  • Synchronization requirements for FileStream.(Begin/End)(Read/Write)

    - by Doug McClean
    Is the following pattern of multi-threaded calls acceptable for a .NET FileStream? Several threads calling a method like this:

        ulong offset = whatever; // different for each thread
        byte[] buffer = new byte[8192];
        object state = someState; // unique for each call, hence also for each thread
        lock (theFile)
        {
            theFile.Seek(whatever, SeekOrigin.Begin);
            IAsyncResult result = theFile.BeginRead(buffer, 0, 8192, AcceptResults, state);
        }
        if (result.CompletedSynchronously)
        {
            // is it required for us to call AcceptResults ourselves in this case?
            // or did BeginRead already call it for us, on this thread or another?
        }

    Where AcceptResults is:

        void AcceptResults(IAsyncResult result)
        {
            lock (theFile)
            {
                int bytesRead = theFile.EndRead(result);
                // if we guarantee that the offset of the original call was at least 8192 bytes from
                // the end of the file, and thus all 8192 bytes exist, can the FileStream read still
                // actually read fewer bytes than that?

                // either:
                if (bytesRead != 8192) { Panic("Page read borked"); }
                // or:
                // issue a new call to begin read, moving the offsets into the FileStream and
                // the buffer, and decreasing the requested size of the read to whatever remains of the buffer
            }
        }

    I'm confused because the documentation seems unclear to me. For example, the FileStream class says: "Any public static members of this type are thread safe. Any instance members are not guaranteed to be thread safe." But the documentation for BeginRead seems to contemplate having multiple read requests in flight: "Multiple simultaneous asynchronous requests render the request completion order uncertain." Are multiple reads permitted to be in flight or not? Writes? Is this the appropriate way to secure the Position of the stream between the call to Seek and the call to BeginRead? Or does that lock need to be held all the way to EndRead, hence only one read or write in flight at a time? I understand that the callback will occur on a different thread, and my handling of state and buffer handles that in a way that would permit multiple in-flight reads. Further, does anyone know where in the documentation to find the answers to these questions? Or an article written by someone in the know? I've been searching and can't find anything.

    Relevant documentation: FileStream class, Seek method, BeginRead method, EndRead method, IAsyncResult interface.

    Read the article

  • SQL SERVER – Read Only Files and SQL Server Management Studio (SSMS)

    - by pinaldave
    Just like for any other developer or DBA, SQL Server Management Studio is my favorite application. At any moment in time I have multiple instances of it open and I am working in them. Recently, I came across a very interesting feature in SSMS related to "Read Only" files. I believe it is a little-known feature as well, so I decided to write a blog post about it.

    First, create a read-only SQL file. You can make any file read-only via Right Click >> Properties >> select the Read Only attribute. Now open the same file in SQL Server Management Studio. You will find that beside the file name there is a small 'lock' icon. This small icon indicates that the file is read-only. Now let us attempt to edit the read-only file. SSMS will let us edit the file any way we want; however, when we attempt to save it, it gives the following pop-up. The options in the pop-up are self-explanatory, and I liked that. The goal of a read-only file is to prevent users from making unintended changes; however, the user should still have complete control over his own file. The user should be aware that the file is read-only, but if he wants to edit the file or save it as a new file, those choices should be presented to him, and the pop-up captures precisely that.

    Now let us check the option related to this feature in SSMS. Go to Menu >> Options >> Environment >> Documents. You will find the third option, which is "Allow editing of read-only files; warn when attempt to save". In the above scenario it was already checked. Let us uncheck it and repeat the same exercise we did earlier. I closed all the earlier windows to avoid confusion. With this new setting, when I attempt to even modify the read-only file, it gives me a totally different pop-up screen, with options like "Edit In-Memory", "Make Writeable", etc. When you select "Edit In-Memory", it allows you to edit the file, and later you can save it as a new file, just like the earlier scenario we discussed. If you click "Make Writeable", it will remove the read-only restriction and the file can be edited as you please.

    Reference: Pinal Dave (http://blog.SQLAuthority.com)

    Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Server Management Studio, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Problem opening SFX archive file(.exe) using the archive manager

    - by Cody
    I have installed both rar and unrar using apt, but I am still not able to use Archive Manager to open the archive file. I have also tried installing p7zip (p7zip-full and p7zip), but with no improvement. However, when I use the command line to extract the files from the archive with unrar or rar, the command executes successfully. Is there any other open-source software I should install for viewing the contents of the SFX archive, or what else should I install to view it in Archive Manager? Thanks in advance...
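    For reference, a sketch of the command-line equivalents (archive.exe is a placeholder name; for RAR-based SFX files, 7z may additionally need the p7zip-rar package):

        # List the contents of a self-extracting archive without running it
        unrar l archive.exe
        7z l archive.exe

        # Extract into the current directory
        unrar x archive.exe
        7z x archive.exe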

    Read the article

  • [VB.Net] System.IO will copy files, but fails to update the destination's file attributes

    - by CFP
    Hello, I have a little VB.NET routine that will copy a file, set its attributes to Normal, update the file time, and then set the attributes back to match those of the source file:

        If IO.File.Exists(Destination) Then IO.File.SetAttributes(Destination, IO.FileAttributes.Normal)
        IO.File.Copy(Source, Destination, True)
        IO.File.SetAttributes(Destination, IO.FileAttributes.Normal)
        IO.File.SetLastWriteTimeUtc(Destination, IO.File.GetLastWriteTimeUtc(Destination).AddHours(1))
        IO.File.SetAttributes(Destination, IO.File.GetAttributes(Source))

    However, I'm encountering a quite strange problem. On some configurations, IO.File.SetLastWriteTimeUtc triggers an UnauthorizedAccess error, although the IO.File.Copy instruction worked very well. I'm totally puzzled: I've checked, and the file attributes are set to 128 (i.e. Normal) successfully. The problem seems to be with SetLastWriteTimeUtc itself. But what is it? Any ideas? Thanks a lot!

    Read the article

  • Is it safe to convert Windows file paths to Unix file paths with a simple replace?

    - by MxyL
    So, for example, say I had it so that all of my files will be transferred from a Windows machine to a Unix machine as such: C:\test\myFile.txt to {somewhere}/test/myFile.txt (the drive letter is irrelevant at this point). Currently, our utility library, which we wrote ourselves, provides a method that does a simple replace of all backslashes with forward slashes:

        public String normalizePath(String path) {
            return path.replaceAll("\\", "/");
        }

    Slashes are reserved and cannot be part of a file name, so the directory structure should be preserved. However, I'm not sure if there are other complications between Windows and Unix paths that I may need to worry about (e.g. non-ASCII names, etc.).
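    A side note on the snippet itself (not on whether the mapping is safe): String.replaceAll treats its first argument as a regular expression, and the Java literal "\\" is a single backslash, which is not a valid pattern on its own, so the call fails at runtime. A sketch of the character-based variant that avoids regex handling entirely:

        public String normalizePath(String path) {
            // replace(char, char) does a literal replacement, so no regex escaping is needed
            return path.replace('\\', '/');
        }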

    Read the article
