Search Results

Search found 49518 results on 1981 pages for 'configuration files'.

  • PHP: Problem with reading Files

    - by mathiregister
    Hello guys, I wonder what I'm doing wrong! I want to read a folder and run through the existing files, checking whether they are images or text files. Text files should be put into a div; images should be output as an image tag.

        <?php
        $path = 'thumbs';
        if ($handle = opendir($path)) {
            while (false !== ($file = readdir($handle))) {
                $ext = pathinfo($file, PATHINFO_EXTENSION);
                if ($ext == "jpg" || $ext == "jpeg" || $ext == "gif" || $ext == "png") {
                    print "<img class='thumb' src='$path/$file'/>";
                } else if ($ext == "txt" || $ext == "rtf") {
                    // read text
                    $lines = file_get_contents($file);
                    $lines = str_replace("\n", "<br/>", $lines);
                    print "<div class='text'>" . $lines . "</div>";
                }
            }
            closedir($handle);
        }
        ?>

    There seems to be a problem I can't find, because ALL images get put out, but only ONE of a few text files gets printed. Any ideas why it only prints one text file? Thank you for your help!
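
    A hedged diagnosis (an editor's note, not part of the original question): readdir() returns bare file names, so file_get_contents($file) reads relative to the script's working directory rather than thumbs/. Any text file that happens to sit next to the script would still print, which would explain exactly one file working. Prefixing the path should fix it:

        // Sketch of the likely fix: read the file from the scanned directory.
        $lines = file_get_contents($path . '/' . $file);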

  • Combining two exe files

    - by Sophia
    Here's some background to my problem: I have a project in Visual C++ 2006 and a project in Visual C++ 2010 Express. Each compiles to its own exe file. I cannot convert my 2006 project to 2010 because I get a lot of "unable to load project" errors. I also cannot port my 2010 project code to 2006 (I always get errors no matter what I try, something to do with libraries). My final solution requires me to have only ONE executable. Is there anything I can do to achieve that? I've done a quick search on Google and found that there are exe joiners, but I've also heard that those tools are often used to make malware. For reference, I am working with "dummy" clients and therefore want to simplify things on their end as much as possible. Having them execute one exe is better than having them execute two, and I do not wish for their antivirus to go haywire because I used some program to join two exes together. What can I do? Edit: The two projects do different things. The VS2006 project sets up a server, and the VS2010 project grabs info on the user's OS. The "server" code has a lot of dependencies and for some reason cannot be converted to Visual C++ 2010. The "grabbing" code requires newer libraries and compiler options, and would not work if ported to 2006.
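
    One conventional route (an editor's hedged sketch, not from the original post): keep both programs, but have the main exe launch the second as a child process, so users only ever run one file. The child's name here is illustrative.

        // Win32 sketch: the server exe spawns the OS-info grabber and waits.
        #include <windows.h>

        int main(void) {
            STARTUPINFO si = { sizeof(si) };
            PROCESS_INFORMATION pi;
            // "grabber.exe" is a placeholder for the second program,
            // shipped in the same directory as the main executable.
            if (CreateProcess(TEXT("grabber.exe"), NULL, NULL, NULL, FALSE,
                              0, NULL, NULL, &si, &pi)) {
                WaitForSingleObject(pi.hProcess, INFINITE);
                CloseHandle(pi.hThread);
                CloseHandle(pi.hProcess);
            }
            return 0;
        }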

  • list all files from directories and subdirectories in Java

    - by Adnan
    What would be the fastest way to list the names of files from 1000+ directories and sub-directories? EDIT: The current code I use is:

        import java.io.File;

        public class DirectoryReader {
            static int spc_count = -1;

            static void Process(File aFile) {
                spc_count++;
                String spcs = "";
                for (int i = 0; i < spc_count; i++)
                    spcs += " ";
                if (aFile.isFile())
                    System.out.println(spcs + "[FILE] " + aFile.getName());
                else if (aFile.isDirectory()) {
                    System.out.println(spcs + "[DIR] " + aFile.getName());
                    File[] listOfFiles = aFile.listFiles();
                    if (listOfFiles != null) {
                        for (int i = 0; i < listOfFiles.length; i++)
                            Process(listOfFiles[i]);
                    } else {
                        System.out.println(spcs + " [ACCESS DENIED]");
                    }
                }
                spc_count--;
            }

            public static void main(String[] args) {
                String nam = "D:/";
                File aFile = new File(nam);
                Process(aFile);
            }
        }
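
    An editor's hedged addition (class name illustrative): java.nio.file, from Java 7 onward and so newer than the code above, walks a tree without building the intermediate File[] arrays, and handles unreadable directories explicitly, which tends to be faster over thousands of directories.

        import java.io.IOException;
        import java.nio.file.*;
        import java.nio.file.attribute.BasicFileAttributes;

        public class FastLister {
            public static void main(String[] args) throws IOException {
                Files.walkFileTree(Paths.get("D:/"), new SimpleFileVisitor<Path>() {
                    @Override
                    public FileVisitResult visitFile(Path file, BasicFileAttributes attrs) {
                        System.out.println(file); // print each file as it is reached
                        return FileVisitResult.CONTINUE;
                    }
                    @Override
                    public FileVisitResult visitFileFailed(Path file, IOException e) {
                        return FileVisitResult.CONTINUE; // skip access-denied entries
                    }
                });
            }
        }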

  • How to recover data files from xampp-windows to xampp-linux after crash?

    - by David Buehler
    My Windows box died after I developed a database in XAMPP on it; fortunately I have a backup of the entire F:/TestWeb/Xampp partition. Unfortunately, I did not do an Export (nor dump) of the "Lws2" database before the crash. I have replaced the defunct machine with one running Mint7 (based on Ubuntu 9.04 "Jaunty Jackalope") and installed xampp-linux into the /opt partition, so the new XAMPP now runs fine in /opt/lampp, and says all the elements are secured by passwords (which I just assigned during this installation).

    I assumed that XAMPP-Windows installed in November would migrate easily to xampp-linux installed in February -- a bad assumption. It apparently would have been simple if I had known enough to do an Export or a Dump before the crash, but.... The backup was done to a Network Attached Storage drive, which is formatted as "vfat", so the backup does not carry with it any valid ownership permissions from MySQL on NTFS.

    I now see from my backup that the old data resided in \TestWeb\Xampp\Mysql\Data\Lws2\ and consists of 7 ".frm" files which define my tables. The actual data -- I suppose a ".sql" file or files -- has disappeared, and I am resigning myself to two days of retyping it. But I do not wish to do the table layouts all over again. So I copied the Data tree to /opt/lampp/Data -- phpMyAdmin does not see it. So I copied the Lws2 tree to /opt/lampp/Lws2 -- phpMyAdmin does not see it. So I copied the Data tree to /opt/lampp/var/mysql/Data -- phpMyAdmin does not see it. So I copied the Lws2 tree to /opt/lampp/var/mysql/Lws2 -- phpMyAdmin does not see it. So I adjusted all the permissions from owner "nobody" to owner "root" and gave full permissions to all groups and to all others, with permissions percolating down, in all 4 trees. You guessed it -- phpMyAdmin does not see any database named Lws2, only its 4 default ones. I double-checked the permissions and rebooted Linux and repeated the tests. At some point in that process I did see phpMyAdmin showing "lws2(7)", but when I clicked on it I saw a "no table found" message. I have not been able to recreate that experience.

    Apparently there are some setup files for MySQL and for phpMyAdmin which need to be set up by running a wizard or two, or by editing the files directly. I grepped the TestWeb tree and found an old ldir = "C:TestWeb\Xampp\MySql\" and a DataDir = C:TestWeb\Xampp\MySql\ in a .php file and in a .bat file, but I cannot find the corresponding config file names on the /opt partition -- so it looks as if these wizards have not been run to create them.

    What config files does Linux use to set up MySQL config files for phpMyAdmin? What wizards do I need to run to point the MySQL engine and phpMyAdmin at the folder /opt/lampp/data/ with its lws2 folder inside it? Or which files do I need to edit, with a sample of what they normally say under Linux?

    Incidentally, I remember I converted from MyISAM with its .MYD and .MYI files to InnoDB after entering only a small amount of the data -- and I do not know what file types to look for -- perhaps my data is still there but under another guise or in another place? Is it something as simple as Linux needing to see "/data/" instead of "/Data"? I will check that out while waiting for a response.

    If anyone can point me to documentation that discusses this level of detail -- I will read it avidly! In any case, thanks for any clarification you can give on this thorny problem. wizdum
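
    Two editor's notes, hedged. First, a fact about storage formats: .frm files hold only the table definitions; after the switch to InnoDB, the row data lives in ibdata1 (plus ib_logfile0/1) at the top of the MySQL data directory, so copying the Lws2 folder alone cannot bring the rows back -- look for ibdata1 in the backup. Second, XAMPP for Linux normally reads its MySQL settings from a single file; a sketch of the relevant stanza (verify the path on your install):

        # /opt/lampp/etc/my.cnf
        [mysqld]
        datadir = /opt/lampp/var/mysql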

  • MSBuild appears to only use old output files for custom build tools

    - by sixlettervariables
    I have an ANTLR grammar file as part of a C# project and followed the steps outlined in the User Manual.

        <Project ...>
          <PropertyGroup>
            <Antlr3ToolPath>$(ProjectDir)tools\antlr-3.1.3\lib</Antlr3ToolPath>
            <AntlrCleanupPath>$(ProjectDir)AntlrCleanup\$(OutputPath)</AntlrCleanupPath>
          </PropertyGroup>
          <ItemGroup>
            <Antlr3 Include="Grammar\Foo.g">
              <OutputFiles>FooLexer.cs;FooParser.cs</OutputFiles>
            </Antlr3>
            <Antlr3 Include="Grammar\Bar.g">
              <OutputFiles>BarLexer.cs;BarParser.cs</OutputFiles>
            </Antlr3>
          </ItemGroup>
          <Target Name="GenerateAntlrCode" Inputs="@(Antlr3)" Outputs="%(Antlr3.OutputFiles)">
            <Exec Command="java -cp %22$(Antlr3ToolPath)\antlr-3.1.3.jar%22 org.antlr.Tool -message-format vs2005 @(Antlr3Input)" Outputs="%(Antlr3Input.OutputFiles)" />
            <Exec Command="%22$(AntlrCleanupPath)\AntlrCleanup.exe%22 @(Antlr3Input) %(Antlr3Input.OutputFiles)" />
          </Target>
          <ItemGroup>
            <!-- ...other files here... -->
            <Compile Include="Grammar\FooLexer.cs">
              <AutoGen>True</AutoGen>
              <DesignTime>True</DesignTime>
              <DependentUpon>Foo.g</DependentUpon>
            </Compile>
            <Compile Include="Grammar\FooParser.cs">
              <AutoGen>True</AutoGen>
              <DesignTime>True</DesignTime>
              <DependentUpon>Foo.g</DependentUpon>
            </Compile>
            <!-- ... -->
          </ItemGroup>
        </Project>

    For whatever reason, the Compile steps only use old versions of the code; no amount of tweaking appears to help.
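
    An editor's hedged observation: the Target declares its inputs as @(Antlr3), but the Exec commands reference a different item name, @(Antlr3Input) / %(Antlr3Input.OutputFiles), which is never defined, so the generator may effectively run against nothing while the stale .cs files keep compiling. The OutputFiles entries also omit the Grammar\ prefix, which can skew MSBuild's up-to-date check. A consistent version might look like:

        <Target Name="GenerateAntlrCode" Inputs="@(Antlr3)" Outputs="%(Antlr3.OutputFiles)">
          <Exec Command="java -cp %22$(Antlr3ToolPath)\antlr-3.1.3.jar%22 org.antlr.Tool -message-format vs2005 @(Antlr3)" />
          <Exec Command="%22$(AntlrCleanupPath)\AntlrCleanup.exe%22 @(Antlr3) %(Antlr3.OutputFiles)" />
        </Target>

    with <OutputFiles>Grammar\FooLexer.cs;Grammar\FooParser.cs</OutputFiles> so the timestamps MSBuild compares are those of the real generated files.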

  • client side validation in ascx files (user controls) for asp.net mvc

    - by Sefer KILIÇ
    Hi, I have a logOn form in an ascx file and I render it as a partial. How can I add client-side validation to this form, any idea? My code below does not work:

        <%= Html.ValidationSummary(true, "Giris basarisiz oldu. Lütfen hatalari düzeltip tekrar deneyin.") %>
        <% Html.EnableClientValidation(); %>
        <% using (Html.BeginForm("LogOnProcess", "Account")) { %>
            <div>
                <fieldset>
                    <legend>Hesap Bilgileri</legend>
                    <div class="editor-label">
                        <%= Html.LabelFor(m => m.UserName) %>
                    </div>
                    <div class="editor-field">
                        <%= Html.TextBoxFor(m => m.UserName) %>
                        <%= Html.ValidationMessageFor(m => m.UserName) %>
                    </div>
                    <div class="editor-label">
                        <%= Html.LabelFor(m => m.Password) %>
                    </div>
                    <div class="editor-field">
                        <%= Html.PasswordFor(m => m.Password) %>
                        <%= Html.ValidationMessageFor(m => m.Password) %>
                    </div>
                    <div class="editor-label">
                        <%= Html.CheckBoxFor(m => m.RememberMe) %>
                        <%= Html.LabelFor(m => m.RememberMe) %>
                    </div>
                    <p>
                        <input type="submit" value="Giris" />
                    </p>
                </fieldset>
            </div>
        <% } %>
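
    An editor's hedged suggestion (ASP.NET MVC 2): Html.EnableClientValidation() only emits the wiring; the validation scripts themselves must be loaded on the page that hosts the partial. If they are missing, adding them is the usual fix:

        <script src="<%= Url.Content("~/Scripts/MicrosoftAjax.js") %>" type="text/javascript"></script>
        <script src="<%= Url.Content("~/Scripts/MicrosoftMvcValidation.js") %>" type="text/javascript"></script>

    The model also needs data-annotation attributes (e.g. [Required]) for there to be any rules to enforce.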

  • Should I deal with files longer than MAX_PATH?

    - by John
    Just had an interesting case. My software reported back a failure caused by a path being longer than MAX_PATH. The path was just a plain old document in My Documents, e.g.:

        C:\Documents and Settings\Bill\Some Stupid FOlder Name\A really ridiculously long file thats really very very very..........very long.pdf

    Total length 269 characters (MAX_PATH == 260). The user wasn't using an external hard drive or anything like that. This was a file on a Windows-managed drive.

    So my question is this: should I care? I'm not saying CAN I deal with the long paths, I'm asking SHOULD I. Yes, I'm aware of the \\?\ Unicode hack on some Win32 APIs, but it seems this hack is not without risk (as it changes the way the APIs parse paths) and also isn't supported by all APIs.

    So anyway, let me just state my position/assertions. First, presumably the only way the user was able to break this limit is if the app she used uses the special Unicode hack. It's a PDF file, so maybe the PDF tool she used uses this hack. I tried to reproduce this (by using the Unicode hack) and experimented. What I found was that although the file appears in Explorer, I can do nothing with it. I can't open it, I can't choose "Properties" (Windows 7). Other common apps can't open the file either (e.g. IE, Firefox, Notepad). Explorer will also not let me create files/dirs which are too long -- it just refuses. Ditto for the command-line tool cmd.exe.

    So basically, one could look at it this way: a rogue tool has allowed the user to create a file which is not accessible to a lot of Windows (e.g. Explorer). I could take the view that I shouldn't have to deal with this. (As an aside, this isn't a vote of approval for a short max path length: I think 260 chars is a joke. I'm just saying that if the Windows shell and some APIs can't handle 260, then why should I?)

    So, is this a fair view? Should I say "Not my problem"? Thanks! John
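
    For readers unfamiliar with the mechanism being discussed, a minimal sketch (an editor's addition; the path is illustrative). The extended-length prefix bypasses the MAX_PATH limit on the wide-character Win32 APIs:

        #include <windows.h>

        int main(void) {
            // The \\?\ prefix (written as L"\\\\?\\" in source) turns off path
            // normalization and allows paths longer than MAX_PATH on wide APIs.
            HANDLE h = CreateFileW(
                L"\\\\?\\C:\\some\\very\\deeply\\nested\\extremely\\long\\path\\file.pdf",
                GENERIC_READ, FILE_SHARE_READ, NULL,
                OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
            if (h != INVALID_HANDLE_VALUE) CloseHandle(h);
            return 0;
        }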

  • IE8 rendering of local files is wrong

    - by Eric
    It appears that IE8 does not render a local file properly. Consider this simple webpage: http://sayang.free.fr/ie8render.html (html code below), extracted from a w3c tutorial on opacity. Save it locally and display it again: the local file has no opacity! That's very annoying, especially when one wants to design complex pages as prototypes stored in local files. Do you have a solution to that?

        <html>
        <head>
            <title>IE8 Local File</title>
            <meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1" />
            <meta http-equiv="pragma" content="no-cache" />
            <meta http-equiv="cache-control" content="no-cache" />
            <meta http-equiv="expires" content="-1" />
            <style type="text/css">
                div.background {
                    width: 500px;
                    height: 250px;
                    background: url(http://www.w3schools.com/css/klematis.jpg) repeat;
                    border: 2px solid black;
                }
                div.transbox {
                    width: 400px;
                    height: 180px;
                    margin: 30px 50px;
                    background-color: #ffffff;
                    border: 1px solid black;
                    /* for IE */
                    filter: alpha(opacity=60);
                    /* CSS3 standard */
                    opacity: 0.6;
                }
                div.transbox p {
                    margin: 30px 40px;
                    font-weight: bold;
                    color: #000000;
                }
            </style>
        </head>
        <body>
            <h2>Save this file locally and open it to see the difference</h2>
            <div class="background">
                <div class="transbox">
                    <p>This is some text that is placed in the transparent box. This is some text that is placed in the transparent box. This is some text that is placed in the transparent box. This is some text that is placed in the transparent box. This is some text that is placed in the transparent box.</p>
                </div>
            </div>
        </body>
        </html>
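
    Two hedged editor's guesses: (1) IE8 applies stricter lockdown to local files than to http:// pages, which can disable effects like the alpha filter; a Mark of the Web comment makes IE treat the saved file as Internet-zone content. (2) The page also has no doctype, so it renders in quirks mode, where IE8 does not honor the CSS3 opacity property; adding one rules that variable out.

        <!DOCTYPE html>
        <!-- saved from url=(0014)about:internet -->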

  • xerces-c: Xml parsing multiple files

    - by user459811
    I'm attempting to learn xerces-c and was following this tutorial online: http://www.yolinux.com/TUTORIALS/XML-Xerces-C.html

    I was able to get the tutorial to compile and run through a memory checker (valgrind) with no problems. However, when I altered the program slightly, the memory checker reported some leaked bytes. I only added a few extra lines to main to allow the program to read two files instead of one.

        int main() {
            string configFile = "sample.xml"; // stat file. Get ambiguous segfault otherwise.
            GetConfig appConfig;
            appConfig.readConfigFile(configFile);
            cout << "Application option A=" << appConfig.getOptionA() << endl;
            cout << "Application option B=" << appConfig.getOptionB() << endl;

            // Added code
            configFile = "sample1.xml";
            appConfig.readConfigFile(configFile);
            cout << "Application option A=" << appConfig.getOptionA() << endl;
            cout << "Application option B=" << appConfig.getOptionB() << endl;
            return 0;
        }

    I was wondering why adding the extra lines of code to read in another XML file results in the following output?

        ==776== Using Valgrind-3.6.0 and LibVEX; rerun with -h for copyright info
        ==776== Command: ./a.out
        ==776==
        Application option A=10
        Application option B=24
        Application option A=30
        Application option B=40
        ==776==
        ==776== HEAP SUMMARY:
        ==776==     in use at exit: 6 bytes in 2 blocks
        ==776==   total heap usage: 4,031 allocs, 4,029 frees, 1,092,045 bytes allocated
        ==776==
        ==776== 3 bytes in 1 blocks are definitely lost in loss record 1 of 2
        ==776==    at 0x4C28B8C: operator new(unsigned long) (vg_replace_malloc.c:261)
        ==776==    by 0x5225E9B: xercesc_3_1::MemoryManagerImpl::allocate(unsigned long) (MemoryManagerImpl.cpp:40)
        ==776==    by 0x53006C8: xercesc_3_1::IconvGNULCPTranscoder::transcode(unsigned short const*, xercesc_3_1::MemoryManager*) (IconvGNUTransService.cpp:751)
        ==776==    by 0x4038E7: GetConfig::readConfigFile(std::string&) (in /home/bonniehan/workspace/test/a.out)
        ==776==    by 0x403B13: main (in /home/bonniehan/workspace/test/a.out)
        ==776==
        ==776== 3 bytes in 1 blocks are definitely lost in loss record 2 of 2
        ==776==    at 0x4C28B8C: operator new(unsigned long) (vg_replace_malloc.c:261)
        ==776==    by 0x5225E9B: xercesc_3_1::MemoryManagerImpl::allocate(unsigned long) (MemoryManagerImpl.cpp:40)
        ==776==    by 0x53006C8: xercesc_3_1::IconvGNULCPTranscoder::transcode(unsigned short const*, xercesc_3_1::MemoryManager*) (IconvGNUTransService.cpp:751)
        ==776==    by 0x40393F: GetConfig::readConfigFile(std::string&) (in /home/bonniehan/workspace/test/a.out)
        ==776==    by 0x403B13: main (in /home/bonniehan/workspace/test/a.out)
        ==776==
        ==776== LEAK SUMMARY:
        ==776==    definitely lost: 6 bytes in 2 blocks
        ==776==    indirectly lost: 0 bytes in 0 blocks
        ==776==      possibly lost: 0 bytes in 0 blocks
        ==776==    still reachable: 0 bytes in 0 blocks
        ==776==         suppressed: 0 bytes in 0 blocks
        ==776==
        ==776== For counts of detected and suppressed errors, rerun with: -v
        ==776== ERROR SUMMARY: 2 errors from 2 contexts (suppressed: 2 from 2)
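
    An editor's hedged reading of the leak: both stacks end in transcode calls inside readConfigFile, and the two lost blocks are 3 bytes each -- exactly the size of the first file's option strings ("10" and "24" plus a terminator). That suggests the second readConfigFile call overwrites the cached transcoded values without freeing them. A hypothetical fix (member names are guesses at the tutorial's code):

        // Inside GetConfig::readConfigFile, before storing a new transcode:
        if (m_OptionA) XMLString::release(&m_OptionA); // free the previous copy
        m_OptionA = XMLString::transcode(xmlchOptionA); // then cache the new one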

  • Cannot Call WordPress Plugin Files Under wp-content

    - by Volomike
    I have a client who has many blog customers. Each of these WordPress blogs calls a plugin that provides a product link. The link is composed like this:

        {website}/wp-content/plugins/prodx/product?id=432320

    This works fine on all blogs except two. On those two, when you try to call the URL, you get a 404. So I disabled all plugins except prodx and reverted the theme to default (Kubrick), thinking perhaps a plugin intercept with the add_action() API was doing this, such as intercepting URLs and redirecting them. However, this did not help. So I upgraded WordPress to the latest version. Again, didn't fix it. So I checked permissions, comparing with a blog that worked just fine. Again, didn't fix it. So I replaced the .htaccess, using one from a working blog. Again, didn't fix it. So I replaced all the files using some from a working blog that was identical to this one, and then restored the wp-config.php file so that it talked to the right blog database. Again, didn't fix it. Again I checked permissions meticulously, comparing to a perfectly working blog. Again, didn't fix it.

    So I created a test.php that looks like so:

        <?php
        print_r($_GET);
        echo "hello world";

    I then copied it into another plugin folder and used my browser to get to it -- again, 404. So I copied it into the root of wp-content/plugins and tried to call it there -- again, 404. So I copied it into wp-content -- again, 404. Last, I copied it into the root of the WordPress blog website, and this time it worked! Doesn't make sense.

    I started to think that perhaps something was going on in /etc/httpd/conf/httpd.conf for this customer, but the only difference I saw there for this customer was the IP address, which differed from the customer's blog that worked. Each customer gets their own IP in this environment my client has built. My client's sysop is baffled too.

    What do you think is going on? Is there something wrong in the WP database for this customer? Is there something wrong in httpd.conf?

  • Planning to create PDF files in Ruby on Rails

    - by deau
    Hi there. A Ruby on Rails app will have access to a number of images and fonts. The images are components of a visual layout which will be stored separately as a set of rules. The rules specify document dimensions along with which images are used and where. The app needs to take these rules, fetch the images, and generate a PDF that is ready for local printing or emailing.

    The fonts will also be important. The user needs to customize the layout by inputting text which will be included in the PDF. The PDF must therefore also contain the desired font so that the document renders identically across different machines. Each PDF may have many pages. Each page may have different dimensions, but this is not essential. Either way, the ability to manipulate the dimensions and margins of the PDF is essential.

    The only thing that needs to be regularly changed is the text. If this takes too much development, then the app can store the layouts in 3rd-party PDFs and edit the textual content directly. Eventually, though, this would prove too restrictive on the app's intended functionality, so I would prefer the app to generate the PDFs itself.

    I have never worked with PDFs before and, for the most part, I've never had to output anything to the user outside their monitor. A printed medium could require a very different approach to get the best results. If anyone has any advice on how to model the PDF format, it would be really appreciated. The technical aspects of printing such as bleed, resolution and colour have already been factored into the layouts and images.

    I am aware that PDF is a proprietary file format and I want to use free or open source software. I have seen a number of Ruby libraries for generating PDF files, but because I am new on this scene I have no way to reliably compare them and too little time to implement and test them all. I also have the option of using C to handle this feature, and if this is process intensive then that might be preferred. What should I be thinking about, and how should I be planning to implement this?
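
    An editor's hedged pointer rather than part of the question: Prawn is one widely used, pure-Ruby, open-source PDF generator that covers the needs listed above (page dimensions, margins, embedded TrueType fonts, placed images). A minimal sketch; file names and dimensions are illustrative:

        require "prawn"

        # Build one page from a stored layout rule: size, margin, image, text.
        Prawn::Document.generate("layout.pdf",
                                 page_size: [612, 792], # points, from the layout rules
                                 margin: 36) do
          font "fonts/CustomFace.ttf"             # embeds the font for identical rendering
          image "images/header.png", at: [0, 700], width: 540
          text "User-supplied text goes here"
        end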

  • [Delphi] open text files in one application

    - by Remus Rigo
    Hi all. I want to write a text editor and assign the txt files to it. My problem is that I want to have only one instance running, and when a new file is opened, to send the filename to the first app that is already running (I want to do this using a mutex). Here is a small test. The DPR looks like this:

        uses
          Windows, Messages, SysUtils, Forms,
          wndMain in 'wndMain.pas' {frmMain};

        {$R *.res}

        var
          PrevWindow : HWND;
          S : string;
          CData : TCopyDataStruct;
        begin
          PrevWindow := 0;
          if OpenMutex(MUTEX_ALL_ACCESS, False, 'MyMutex') <> 0 then
          begin
            repeat
              PrevWindow := FindWindow('TfrmMain', nil);
            until PrevWindow <> Application.Handle;
            if IsWindow(PrevWindow) then
            begin
              SendMessage(PrevWindow, WM_SYSCOMMAND, SC_RESTORE, 0);
              BringWindowToTop(PrevWindow);
              SetForegroundWindow(PrevWindow);
              if FileExists(ParamStr(1)) then
              begin
                S := ParamStr(1);
                CData.dwData := 0;
                CData.lpData := PChar(S);
                CData.cbData := 1 + Length(S);
                SendMessage(PrevWindow, WM_COPYDATA, 0, DWORD(@CData));
              end;
            end;
          end
          else
            CreateMutex(nil, False, 'MyMutex');
          Application.Initialize;
          Application.CreateForm(TfrmMain, frmMain);
          Application.Run;
        end.

    PAS:

        type
          TfrmMain = class(TForm)
            memo: TMemo;
          private
            procedure WMCopyData(var msg : TWMCopyData); message WM_COPYDATA;
          public
            procedure OpenFile(f : String);
          end;

        var
          frmMain: TfrmMain;

        implementation

        {$R *.dfm}

        procedure TfrmMain.WMCopyData(var msg : TWMCopyData);
        var
          f : String;
        begin
          f := PChar(msg.CopyDataStruct.lpData);
          //ShowMessage(f);
          OpenFile(f);
        end;

        procedure TfrmMain.OpenFile(f : String);
        begin
          memo.Clear;
          memo.Lines.LoadFromFile(f);
          Caption := f;
        end;

    This code should be OK, but if I open a text file (from the second app), the first app receives a message like this: [screenshot not included in this listing]. Thanks!
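
    An editor's hedged note: on a Unicode Delphi (2009 and later), CData.cbData := 1 + Length(S) counts characters rather than bytes, so only half of the string is copied and the receiver sees garbage. Sizing in bytes would be the usual correction:

        // Hypothetical fix for Unicode Delphi: cbData is a byte count.
        CData.cbData := (Length(S) + 1) * SizeOf(Char);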

  • How do I change the location of DivX cache files?

    - by andygrunt
    I recently updated to the latest version of DivX and suddenly found my C drive filling up with cache files. I tracked them down to:

        C:\Files\My Videos\DivX Movies\Temporary Downloaded Files

    My old laptop (running WinXP) has only a small hard drive, and any DivX cache files fill it up, so I want DivX to use my D drive where I have a little more room. The trouble is I can't see anywhere in the DivX preferences where I can change the cache location. Can anyone tell me how to change the location of the DivX cache files?

  • Why have my Aperture referenced files disappeared without trace?

    - by Andy Harvey
    I have an Aperture library that I've been using for several years. In the past week a number of recent photos and movies have mysteriously disappeared. The missing files were all safely imported within the last week (I had reviewed some of the movies within Aperture, so I know this to be true). The files were all referenced on an external drive. Within Aperture, the files now have a "referenced file can't be found" icon. I've tried searching for the missing files manually, including within the Aperture library package, but they cannot be found anywhere. How can I (a) work out where the missing files have gone, and (b) identify the cause and ensure it doesn't happen again? I'm using Aperture 3.3.2.

  • Mac OS X software always orders files alphabetically rather than by type

    - by george
    I have noticed that many Mac applications sort files alphabetically rather than by type. A good example would be Coda by panic.com: the files in its file menu are organized alphabetically. I requested that they add a feature to organize files by type, and they said that it's a Finder thing. So I looked at other applications to see if they were organizing by type. I noticed Dreamweaver CS4 had this same behavior, as does Dreamweaver CS5. There has to be something in the Mac that does this and that I can modify. I played with Spotlight, and it now displays its files by type (thinking that's what I could change), but it didn't take effect in other applications. What library are these applications using to display a file menu for their files?

  • what software will rip a CD to flac files in one step?

    - by jcollum
    I'm using EAC to rip files to FLAC. Seems fine. But for a large number of CDs (200 or so) the process is a little labor-intensive. There are 4 or 5 clicks in there that seem extraneous -- EAC seems to want to rip the CD to a set of WAV files, and then (after all those files are ripped) I have to select "Process WAV files" (I'd get it exact, but EAC is occupied right now). I'd like to send all of it to FLAC files in just a few steps: get CDDB info, select art, rip. Is there a way to do this with EAC that I'm missing? I haven't been able to find anything on the web that explains this. Is there a better program for this?
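
    An editor's hedged note: EAC can encode straight to FLAC during the rip if FLAC is configured as an external compressor (EAC > Compression Options > External Compression). Field labels and placeholders vary between EAC versions (older builds use %s instead of %source%), so treat this as a sketch:

        Use external program for compression : checked
        Program, including path              : C:\Program Files\FLAC\flac.exe
        Use file extension                   : .flac
        Additional command-line options      : -8 -V %source%

    With that set, the compressed copy action under the Action menu goes from disc to .flac in one pass.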

  • Batch OCR for many PDF files (not already OCRed)?

    - by David
    Hello. I use Google Desktop Search (I am on Vista) and not all my PDF files are recognized in my archive folder. That is normal, as "PDF files that contain scanned images" are not indexed (http://desktop.google.com/support/bin/answer.py?hl=en&answer=90651). So I would like to OCR the many PDF files of mine that are not already OCRed.

    My goal: I give the program a folder and it searches the subfolders on its own for the PDF files that need to be converted into OCRed PDFs. Note: in the past, if a PDF file was password protected, I removed the password with another batch (paying) tool: verypdf.com "pwdremover".

    Any (not too expensive) ideas? I have already tried FineReader 6 Pro on XP at the time, but there was no batch processor included; and Paperfile (paperfile.net), which uses Tesseract (code.google.com/p/tesseract-ocr/), but that OCR is only PDF to text, not PDF to PDF! There is also another project, code.google.com/p/ocropus. Thanks in advance ;)
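
    An editor's hedged addition, newer than the original question: OCRmyPDF, a free command-line wrapper around Tesseract, does exactly this batch job, producing PDF output and skipping pages that already contain text. A sketch (assumes GNU find and ocrmypdf installed; the folder name is illustrative):

        # OCR every PDF under a folder tree in place; --skip-text leaves
        # already-OCRed pages untouched.
        find archive/ -name '*.pdf' -exec ocrmypdf --skip-text {} {} \;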

  • What does this mean: "SATP VMW_SATP_LOCAL does not support device configuration"?

    - by Jason Tan
    Can anyone tell me what this means in ESXi 5.1?

        SATP VMW_SATP_LOCAL does not support device configuration

    I've googled it and I get a lot of results, but as yet all the pages that contain the string are discussing other matters. The storage array is an HDS HUS-VM and the hosts are HP BL460c G8 blades with Flex Fabric and Flex Fabric VCs, which I am in the process of commissioning and would like to get started on the right foot -- i.e. error and warning free!

        naa.600508b1001c56ee3d70da65f071da23
           Device Display Name: HP Serial Attached SCSI Disk (naa.600508b1001c56ee3d70da65f071da23)
           Storage Array Type: VMW_SATP_LOCAL
           Storage Array Type Device Config: SATP VMW_SATP_LOCAL does not support device configuration.
           Path Selection Policy: VMW_PSP_FIXED
           Path Selection Policy Device Config: {preferred=vmhba0:C0:T0:L1;current=vmhba0:C0:T0:L1}
           Path Selection Policy Device Custom Config:
           Working Paths: vmhba0:C0:T0:L1
           Is Local SAS Device: true
           Is Boot USB Device: false

    This is the same LUN:

        ~ # esxcli storage core device list -d naa.60060e80132757005020275700000016
        naa.60060e80132757005020275700000016
           Display Name: HITACHI Fibre Channel Disk (naa.60060e80132757005020275700000016)
           Has Settable Display Name: true
           Size: 204800
           Device Type: Direct-Access
           Multipath Plugin: NMP
           Devfs Path: /vmfs/devices/disks/naa.60060e80132757005020275700000016
           Vendor: HITACHI
           Model: OPEN-V
           Revision: 5001
           SCSI Level: 2
           Is Pseudo: false
           Status: degraded
           Is RDM Capable: true
           Is Local: false
           Is Removable: false
           Is SSD: false
           Is Offline: false
           Is Perennially Reserved: false
           Queue Full Sample Size: 0
           Queue Full Threshold: 0
           Thin Provisioning Status: unknown
           Attached Filters: VAAI_FILTER
           VAAI Status: supported
           Other UIDs: vml.020001000060060e801327570050202757000000164f50454e2d56
           Is Local SAS Device: false
           Is Boot USB Device: false
        ~ #

  • What files should be excluded from a complete Windows backup?

    - by tro
    I'm starting to use CrashPlan to back up my Win 7 PC. I've got it writing to my external HD (for quick local restores) and to CrashPlan Central (for offsite storage). I'd like to back up my entire C:\ drive (the only partition) in a way that preserves all of my installed software and configuration, but avoids backing up log files and other ephemeral / temporary files that are regenerated during normal operation of the OS.

    Which files and/or directories should I be excluding from backups? I'd like to make this a community wiki, so that we can all contribute towards a definitive list.

    Here's a list of regular expressions identifying the directories and files that CrashPlan excludes on Windows by default, listed at http://support.crashplan.com/doku.php/articles/admin_excludes:

        .*/(?:42|\d{8,})/(?:cp|~).*
        (?i).*/CrashPlan.*/(?:cache|log|conf|manifest|upgrade)/.*
        .*\.part
        .*/iPhoto Library/iPod Photo Cache/.*
        .*\.cprestoretmp.*
        *\.rbf
        :/Config\\.Msi.*
        .*/Google/Chrome/.*cache.*
        .*/Mozilla/Firefox/.*cache.*
        .*\$RECYCLE\.BIN/.*
        .*/System Volume Information/.*
        .*/RECYCLER/.*
        .*/I386.*
        .*/pagefile.sys
        .*/MSOCache.*
        .*UsrClass\.dat\.LOG
        .*UsrClass\.dat
        .*/Temporary Internet Files/.*
        (?i).*/ntuser.dat.*
        .*/Local Settings/Temp.*
        .*/AppData/Local/Temp.*
        .*/AppData/Temp.*
        .*/Windows/Temp.*
        (?i).*/Microsoft.*/Windows/.*\.log
        .*/Microsoft.*/Windows/Cookies.*
        .*/Microsoft.*/RecoveryStore.*
        (?i).:/Config\\.Msi.*
        (?i).*\\.rbf
        .*/Windows/Installer.*

    Other excludes:

        .*\.(class|obj)
        .*/hiberfil.sys
        (?i).*\.tmp
        (?i).*/temp/
        (?i).*/tmp/
        .*Thumbs\.db
        .*/Local Settings/History/
        .*/NetHood/
        .*/PrintHood/
        .*/Cookies/
        .*/Recent/
        .*/SendTo/

  • I used my NTFS-formatted hard drive, copied files from a Mac using a bridge program, but now I can't

    - by Marc
    I used my NTFS-formatted hard drive, copied files from a Mac to it using a bridge program, but now I can't see any of the files on my PC. Help? I brought the drive from home and found that on the Mac it was read-only (NTFS), so I got a bridge program (don't remember which) and was able to copy files from the Mac to the hard drive. I took it home and went to get into those files, and they do not even show up. The copy did everything it was supposed to do to let me know it was working, but now I can't find any of those files. Am I FUBAR? :(

  • How can I setup my local Nginx server so I can edit the files?

    - by Shane Grant
    I have my local development machine running Arch Linux, Nginx, PHP-FPM and MySQL. In order for the websites I am working on to run, the files need to be owned by the http user. The files are currently located in folders like this:

        /srv/http/site1/
        /srv/http/site2/

    When I use the following chown command on the http folder, the sites work fine, but I cannot edit the files with my user:

        chown -R http.users /srv/http

    When I do this, the sites do not work, but I can edit the files:

        chown -R shane.http /srv/http

    How can I make it so that my user can edit the files, and the web server can run them, at the same time? Thank you.
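
    A hedged sketch of one common arrangement (an editor's addition, not from the post): keep the files group-owned by http with group write, and put your user in that group; with setgid on the directories, new files stay editable by both sides.

        # Assumes user 'shane' and group 'http', as in the question.
        usermod -aG http shane            # let your user into the web group
        chown -R shane:http /srv/http     # your user owns; the http group shares
        chmod -R g+rwX /srv/http          # group gets rw on files, rwx on dirs
        find /srv/http -type d -exec chmod g+s {} \;   # new files inherit the group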

  • Unable to reach files in a subfolder with a domain name in the path in IIS 5

    - by Chuck Conway
    In IIS 5, files at the URL http://acme.com/_cache/cache-www.acme.com/v3.css are not accessible; all files below "cache-www.acme.com" are unreachable. I've verified that the files exist. Permissions are not a problem: I've assigned "Everyone" to the files and given "Everyone" full rights. What I have determined is that in IIS 5, if there is a domain name in the folder path, IIS gets confused. Other JavaScript files outside the directory come down fine. Any thoughts?

  • is there a way to prespecify to overwrite files with same name?

    - by Celeritas
    Connections to network drives are ridiculously slow (e.g. 15 kB/sec on really good days), so when I'm copying files I leave my desk. My problem is when there is a file with the same name to be overwritten: is there a way to specify in advance to overwrite files? I know the dialog has the option "do same for next x conflicts", but that doesn't pop up until (in some cases) a long time after the files start copying. See my dilemma?

    Example: copying 500 files, estimated time 2 hours. I leave; after 10 minutes a message comes up about a file with the same name and asks if it should be overwritten (and in this time copying stalls); I come back 30 minutes later to find that only the files from the first 10 minutes were copied.

    Out of curiosity, how could the network speed be so bad? I asked the boss and he said because it gets routed around a lot and is just bad :(
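
    An editor's hedged aside: a command-line copy never stops to ask. robocopy (built into Vista and 7, a free download for XP) overwrites differing files without prompting; paths here are illustrative.

        robocopy "C:\source" "\\server\share\dest" /E /R:2 /W:5
        rem /E copies subfolders (including empty ones); /R and /W shorten the
        rem default retry behaviour so a locked file cannot stall the whole run.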

  • suPHP permission requirements for all files or only PHP scripts?

    - by puk
    Generally with PHP, files/folders are given a permission of 0777 when we want to write to them. suPHP forbids a permission of 0777; instead, files are supposed to have a permission of 0644 and folders a permission of 0755. However, this is always worded differently:

        "suPHP won't allow chmod of 777 so we need to set all files to 644 and directories to 755"
        "Your scripts and directories can now only have a maximum of 755 permissions"
        "Set file permission(s) to allow read access only by you (e.g., chmod 600 filename.suphp)"

    What is the exact rule, and which files does it apply to? If I have a README file somewhere in a nested directory, does its permission need to be updated, or does this only apply to .php files?
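
    An editor's hedged answer sketch: suPHP only ever executes PHP scripts, so its permission checks apply to the scripts it runs (and their parent directories), not to static files like a README, which the web server serves directly. The enforced rules live in suphp.conf; typical stock settings look like this (path and values illustrative):

        ; From a typical /etc/suphp/suphp.conf
        allow_file_group_writeable=false
        allow_file_others_writeable=false
        allow_directory_group_writeable=false
        allow_directory_others_writeable=false

    Under those settings, anything group- or world-writable (so 0777, or 0666) is refused, which is where the usual "644 for files, 755 for directories" advice comes from.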

  • In CentOS 4.3, Webmin 1.300 bandwidth monitoring is eating disk space. How do I delete those files?

    - by Silkograph
    I maintain a Linux server used for mail, Squid and DNS service. Recently I noticed that something was eating the server's disk space, and today I finally caught the culprit, which was consuming the disk by storing a large number of files. Webmin 1.300 is installed on this server. We use the Squid proxy and Sarg to monitor Internet access. For the last few years I have always manually cleared the Sarg-generated files under /var/www/html/squid, but I never realized that Webmin also stores bandwidth log files in its directory structure. I have noticed that under /etc/webmin/bandwidth/hours it has stored many thousands of files since 2007, totaling about 17 GB. We have a 40 GB HDD in this server machine. My question is: how can I delete those /etc/webmin/bandwidth/hours files safely?
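
    An editor's hedged sketch: stop collection first (in Webmin, the Bandwidth Monitoring module has a disable/stop option), otherwise the files will simply return; then the accumulated per-hour files can be removed:

        # The hourly files are plain data files, so deleting them is safe once
        # collection is stopped; -delete avoids building a huge argument list.
        find /etc/webmin/bandwidth/hours -type f -delete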
