Search Results

Search found 20313 results on 813 pages for 'batch size'.

Page 342/813

  • How do I lock a hard drive to a certain drive letter?

    - by Memor-X
    I have an external hard drive, assigned to drive F, that contains some programs with shortcuts on my desktop. I also have a second external hard drive, which I use to store my music, and it gets auto-assigned to drive E. Because of the bulky power plugs on the drives and the closely spaced sockets on my power board, I can't have both drives plugged in at the same time. I normally only connect the music drive when I sync music to my iPod, but occasionally, when I swap the music drive for the old one, or when I boot with the music drive attached, shut down, and boot again with the old drive, the old drive's letter gets switched to E. I get annoyed having to go into Disk Management and change the drive letter back to F every time this happens, so I am wondering whether I can lock that drive to always be F; anything else that tries to take F can fail for all I care. Alternatively, if there is a batch file that can run through the Disk Management steps to change a drive's letter, I could put it in my Startup folder.
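
    A minimal sketch of the batch-file route, assuming the drive can be identified by its volume number in diskpart (the "3" below is a placeholder; check "list volume" once to find the right one) and that the script runs elevated:

        @echo off
        rem Re-assign the programs drive to F: via a diskpart script.
        rem "select volume 3" is a placeholder - run diskpart's "list volume"
        rem once to find the correct volume number for that drive.
        (
          echo select volume 3
          echo assign letter=F
        ) > "%TEMP%\assign-f.txt"
        diskpart /s "%TEMP%\assign-f.txt"
        del "%TEMP%\assign-f.txt"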

    Read the article

  • Convert png sequence to x264 with ffmpeg

    - by Thucydides411
    I am trying to convert a series of PNGs into an MP4 video. I am using ffmpeg and want to encode the video with the x264 codec. Using the command

        ffmpeg -y -r 30 -b 1800k -i _tmp%04d.png -vcodec libx264 out.mp4

    I get the following warning message:

        Incompatible pixel format 'bgra' for codec 'libx264', auto-selecting format 'yuv420p'

    My understanding is that there is an alpha channel in the PNGs which the x264 encoder cannot handle. Is there a way to get around this problem? Is there, for example, a way to get the encoder to ignore the alpha channel (my PNGs don't actually have any transparent elements)? I'm aware that I could batch convert the PNGs beforehand to strip the alpha channel, but the sequence of images is produced by another program, and having to preprocess the images each time I make a video would be less than optimal.

    Edit: After stripping the alpha channel from each frame with

        convert in.png -background white -flatten +matte out.png

    ffmpeg gives the warning

        Incompatible pixel format 'pal8' for codec 'libx264', auto-selecting format 'yuv420p'

    so still no dice.
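
    A variant worth noting, given only as a hedged sketch: ffmpeg can be told the output pixel format explicitly with -pix_fmt, which is the usual way to force 4:2:0 output regardless of the source's alpha or palette; whether it silences the warning in this particular build is an assumption to test:

        ffmpeg -y -r 30 -b 1800k -i _tmp%04d.png -vcodec libx264 -pix_fmt yuv420p out.mp4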

    Read the article

  • Why won't 2GB of RAM across 3 of 4 slots work on my motherboard (max 2GB)?

    - by Andrew
    My desktop is an old home-built machine circa 200[5-6] running Ubuntu 11.10 (but this is not relevant, because I'm reading the available RAM from the BIOS loading screen), with an ASUS P5GPL motherboard - not X or X-SE - which has four slots. I'm mainly a laptop person, but keep this around for running a server from if needed, backing up to, seeding Ubuntu to people from, etc.

    It has four (DDR) RAM slots, two black and two blue, in the order black-blue-black-blue (I will call them D, C, B, and A, respectively) with some space in the middle. The blue ones are the closest to the processor. I used to have two 512MB chips in the two blue slots. I just got a 1GB chip and plugged it into one of the black slots; my system didn't recognize it. I messed around and discovered that it will not recognize chips in many positions, and I couldn't get it to recognize all three of these chips at the same time. In particular, if I put the 512MB chips in A and B it would only use one, but AC, AD, BD, and CD worked. I didn't try BC, I believe. Only some of these continue to work when I switch the 1GB chip into one of those positions.

    Can I have some advice as to how to position these chips to get all 2GB used? How about if I get another 1GB chip - where should I put the two? And what about the RAM maximum Crucial says? Can I go above 2GB if I get another 1GB chip? Right now, I have a 512MB chip in A and the 1GB chip in C.

    EDIT: I read some other posts and tried dmidecode in Ubuntu to clarify the max-memory question (that wasn't a major part anyway). It says my max memory module size is 1024M (OK) and my max memory size is 4096M (which doesn't agree with Crucial OR the ASUS web site; maybe it will only work while in Linux and the BIOS won't OK it?).

    Read the article

  • Stop SQL Server services conveniently

    - by MedicineMan
    I have a general-use laptop that I use for games, development, and web surfing. I've just installed SQL Server 2008 with Analysis Services, Reporting Services, error reporting, and most of the other options in the installer. I also have a default instance of SQL Server as well as a named instance. When I'm not doing development, I'd like to shut these services down conveniently; I'm thinking a batch file would be good. What are the commands to shut these services down and release the associated memory and resources? It appears that

        net stop MSSQLSERVER

    stops the default MSSQLSERVER instance. What about the other services?
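
    A hedged sketch of the kind of batch file described here, assuming a named instance called SQLEXPRESS and the usual SQL Server 2008 service names; the exact names on a given machine should be confirmed in services.msc:

        @echo off
        rem Default instance: stop the agent first, then the engine
        net stop SQLSERVERAGENT
        net stop MSSQLSERVER
        rem Named instance - SQLEXPRESS is a placeholder for the instance name
        net stop "SQLAgent$SQLEXPRESS"
        net stop "MSSQL$SQLEXPRESS"
        rem Analysis and Reporting services for the default instance, if installed
        net stop MSSQLServerOLAPService
        net stop ReportServer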

    Read the article

  • Application Compatibility Clients do not show in MSSQL database, but do show in \AppCompat\

    - by rjt
    The Application Compatibility clients are not being denied access to the central MSSQL database, and they are able to leave their own files in the \AppCompat\ share. However, the only computer that shows up in the "Microsoft Application Compatibility Manager" database is the machine I initially created the .MSI installer from. The MSI was pushed out successfully via GPO and, as I said, there are plenty of files in the \AppCompat\ share from many different computers. But only one PC shows up in the "Data Collection Manager" database, so I only have data from one machine. I could manually add all these machine accounts (ADNETBIOSNAME\MACHINENAME221$) to the MSSQL AppCompat database permissions list, or use an SQL command to do so in batch, but I suspect I must have missed something. Do you manually edit the MSI to set the credentials?
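
    For the "SQL command in batch" route mentioned above, a hedged sketch only - the server name, database name, and machine list file are placeholders, and whatever database role the collection service actually expects would still need to be granted separately:

        @echo off
        rem machines.txt: one machine account per line, e.g. ADNETBIOSNAME\MACHINENAME221$
        rem SQLSERVER01 and AppCompatDB are placeholders.
        for /f "usebackq delims=" %%M in ("machines.txt") do (
            sqlcmd -S SQLSERVER01 -E -d AppCompatDB -Q "CREATE LOGIN [%%M] FROM WINDOWS"
            sqlcmd -S SQLSERVER01 -E -d AppCompatDB -Q "CREATE USER [%%M] FOR LOGIN [%%M]"
        )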

    Read the article

  • Apache2 - 500 internal server error

    - by Lucio Coire Galibone
    I'm running a VPS with CentOS 6, 4 GB of RAM, 10 GB of disk, and 2 virtual CPUs (Intel Xeon L5640 @ 2.27GHz). According to my host, each virtual CPU corresponds to at least 0.5 physical CPU. At certain times of the day, the ones with more traffic, accessing my PHP script intermittently returns "500 Internal Server Error". I turned Apache logging up to debug level and enabled PHP logging with E_ALL, but I can't find any reference to the 500 error in any of the logs (and I checked the right logs). There is no .htaccess file in the script's path. The strange thing is that the error starts at the first PHP line of the script: the preceding HTML is displayed correctly, but at the first PHP line the script returns the 500 error. CPU load is always fine (max 0.15 0.08 0.01) and RAM sits close to 95%, but the server has only hit swap twice in a month, with 2-5 MB. Apache runs with prefork and these values:

        <IfModule prefork.c>
            StartServers 8
            MinSpareServers 5
            MaxSpareServers 20
            ServerLimit 280
            MaxClients 280
            MaxRequestsPerChild 4000
        </IfModule>

    Everything works correctly and I get no errors during quiet times, but the errors start when traffic rises (6,000-9,000 visits per hour). Can I solve the problem by adding resources? (I can upgrade the RAM up to 16 GB.) Could it depend on reaching MaxClients (but Apache would log that, right?)? If I upgrade the RAM to 6 or 8 GB, should I calculate the MaxClients value as

        MaxClients = Total RAM dedicated to the web server / Max child process size

    where the maximum child process size is around 20 MB? What else could the problem be? Thanks in advance.
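
    A worked instance of that formula, treating the figures as assumptions (20 MB per child, roughly 1 GB set aside for the OS, MySQL and everything else):

        8 GB box: MaxClients ≈ (8192 - 1024) / 20 ≈ 358
        4 GB box: MaxClients ≈ (4096 - 1024) / 20 ≈ 153

    Under those assumptions, the current setting of 280 would only be memory-safe after the RAM upgrade.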

    Read the article

  • Debugging Samba/CUPS printer sharing with Windows

    - by mrdrbob
    I've got an HP Deskjet hooked up to a Slackware 12.2 box. I've got CUPS set up and can print a test page from the box just fine. I've also got Samba set up and have a couple of file shares that work fine. I'm trying to share that HP Deskjet out via Samba, but I can't get it to show up on any Windows system. I see the server and its file shares in Windows networking, but when I open Printers, no printer shows up. Running net view \\servername from the command line lists the file shares, but no printers. Here's the pertinent part of my smb.conf, if that helps:

        [global]
            workgroup = HOMENET
            security = share
            hosts allow = 192.168.1. 192.168.2. 127.
            load printers = yes
            printcap name = cups
            printing = cups
            log file = /var/log/samba.%m
            max log size = 50

        [printers]
            comment = All Printers
            path = /var/spool/samba
            browseable = no
            public = yes
            writable = no
            printable = yes
            guest only = yes

    Can anyone give me some pointers as to where to start looking for potential causes?

    Update: Running testparm shows no errors. Here's the output (minus the file shares):

        [global]
            workgroup = HOMENET
            security = SHARE
            log file = /var/log/samba.%m
            max log size = 50
            printcap name = cups
            hosts allow = 192.168.1., 192.168.2., 127.

        [printers]
            comment = All Printers
            path = /var/spool/samba
            guest only = Yes
            guest ok = Yes
            printable = Yes
            browseable = No

    Read the article

  • Converting a bunch of MP3s to mono?

    - by Wil
    I have a bunch of stereo MP3s I'd like to convert to mono. What is the best way to do this? I would prefer something that lets me batch process them. I want to keep the quality as close to the original as possible. My files are also at different bitrates, so I don't want to make every file 320 kbps when some are only 128. Also, is there any quick way to see which files in my entire library are stereo?
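
    One possible batch approach, shown only as a sketch: it assumes ffmpeg is installed and on the PATH, and the fixed 128k bitrate is a placeholder - matching each file's original bitrate would need per-file probing (e.g. with ffprobe) rather than a one-liner:

        @echo off
        rem Converts every MP3 in the current folder to mono, writing into .\mono\
        rem (batch-file syntax; at an interactive prompt use %F instead of %%F)
        mkdir mono 2>nul
        for %%F in (*.mp3) do ffmpeg -i "%%F" -ac 1 -ab 128k "mono\%%~nxF"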

    Read the article

  • Running a scheduled task as SYSTEM with console window open

    - by raoulsson
    I am auto creating scheduled tasks with this line within a batch windows script: schtasks /Create /RU SYSTEM /RP SYSTEM /TN startup-task-%%i /TR %SPEEDWAY_DIR%\%TARGET_DIR%%%i\%STARTUPFILE% /SC HOURLY /MO 1 /ST 17:%%i1:00 I wanted to avoid using specific user credentials and thus decided to use SYSTEM. Now, when checking in the taskmanagers process list or, even better, directly with the C:\> schtasks command itself, all is working well, the tasks are running as intended. However in this particular case I would like to have an open console window where I can see the log flying by. I know I could use C:\> tail -f thelogfile.log if I installed e.g. cygwin (on all machines) or some proprietary tools like Baretail on Windows. But since I only switch to these machines in case of trouble, I would prefer to start the scheduled task in such a way that every user immediately sees the log. Any chance? Thanks!
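
    For comparison only, a hedged sketch of a built-in "tail" window that avoids Cygwin and BareTail - it assumes PowerShell is installed, the log path is a placeholder, and whether a task running as SYSTEM can put such a window on the interactive desktop is exactly the open question here:

        @echo off
        rem Opens a console window that follows the log file as it grows (Ctrl+C to stop).
        rem C:\speedway\thelogfile.log is a placeholder path.
        start "log" powershell -NoExit -Command "Get-Content 'C:\speedway\thelogfile.log' -Wait"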

    Read the article

  • Find which php scripts cause high CPU with php-cgi

    - by Oli
    Background: I maintain a server for a client who has half a dozen WordPress sites on it. They all have the W3 Total Cache plugin installed, and eAccelerator is installed (it might be APC). All the PHP sites run through a single batch of FastCGI php-cgi processes (it's actually php-fpm, but I'm not sure if that makes a difference). Problem: php-cgi's CPU usage is quite high. Not terminally high, but high enough to raise an eyebrow. The client wants to add more sites in the future and I want to avoid becoming CPU-limited if I can help it. Question: Is there any way I can find the scripts, or even just the requests, that are causing the high CPU? I realise I might not be able to do anything with the results, but it would give me a chance.

    Read the article

  • Etag configuration with multiple apache servers or CDN / How does Google do ETags?

    - by perrierism
    I have an application which is served from two Apache 2 servers, and I want to configure the ETags on static content. In the future I would also like to use a CDN. I see that this is supposed to be a problem because the ETag information will be different from server to server:

        The ETag format for Apache 1.3 and 2.x is inode-size-timestamp. Although a given file may reside in the same directory across multiple servers, and have the same file size, permissions, timestamp, etc., its inode is different from one server to the next.

    So if you're using more than one web server to host your app (like 90% of the web apps you use every day do), it's supposed to be an issue. However, I see Google uses ETags, and they certainly use multiple servers plus CDN and edge caching, etc., and I get a 304 response for any cached Google content. How do they do it? How do you get around the multiple-server issue? Is there a way to configure this with Apache?
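
    For reference, a hedged sketch of the Apache-side knob usually discussed in this context: the FileETag directive controls which components go into the ETag, and dropping the inode component is the standard way to make ETags comparable across servers (whether that fits this setup is for the poster to judge):

        # httpd.conf, or the relevant vhost / directory block
        FileETag MTime Size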

    Read the article

  • GlusterFS on VMWare ESXi 5

    - by Dharmavir
    I want to build a network file system on top of my VMware ESXi-based virtual nodes, which run Ubuntu 12.04 LTS. I am evaluating options and found that GlusterFS (http://www.gluster.org/) could turn out to be a good choice.

    Purpose: I have about two dozen VM nodes with different configurations on 2 physical nodes, each with the following configuration: 16-core Intel Xeon, 1 TB disk, 48 GB RAM.

    As I said, each physical server has about 1 TB of disk, and I can add more if I want, so for now I have 2 TB of disk space available. That space is distributed among the roughly two dozen VM nodes that live on it. Since some of them are application servers and management servers, they have plenty of free disk space, which I want to use for some heavy storage that I could not set up on a single VM node. If my storage is distributed across dozens of VM nodes and 2 or more physical nodes, I also get some sort of backup. I do not mind if data is stored redundantly, but as far as I understand, an individual VM node will not be able to store all of the data: a complete data set of, say, 100 GB would exceed a VM disk size of 70 GB, and the VM also has system and program files on it.

    I need a suggestion: will GlusterFS be the solution I am looking for, or should I go with something like Hadoop? I am not too sure. But yes, I would like to use the free space on each VM node, and if the data ends up stored redundantly in the process, I am okay with that because it gives me data security.

    Read the article

  • Help transferring Mac OS X fonts to Windows

    - by Addsy
    I have been sent a QuarkXPress document that contains a number of unusual fonts, which were also sent to me. The fonts were in a folder named "fonts", and everything was zipped up into a single archive containing the document and the fonts folder. I believe this was done on a machine running Mac OS X. I have unzipped the archive, but all of the files in the fonts folder have no size - they are 0-byte files. However, there is also a _MACOSX folder which has a fonts subfolder. In there I can see all the font files (prefixed with a dot), and they all have a size, i.e. they are not 0 bytes. Presumably these are the actual font files I need, but I cannot figure out how to install them on Windows. I have tried renaming the files to have .ttf and .otf extensions and then installing them, but that doesn't work - I just get an "Invalid font file" message. Any other ideas?

    Read the article

  • MySQL is not connecting after data directory change

    - by user123827
    I've changed the data directory in /etc/my.cnf:

        datadir=/data/mysql
        socket=/data/mysql/mysql.sock

    I also moved the mysql folder from /var/lib/mysql/ to /data/mysql. Now when I connect to MySQL I get the following error:

        [root@youradstats-copy mysql]# mysql
        ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/lib/mysql/mysql.sock' (2)

    Also, in /var/logs/mysqld.log I see the following messages:

        InnoDB: Setting log file /data/mysql/ib_logfile0 size to 512 MB
        InnoDB: Database physically writes the file full: wait...
        InnoDB: Progress in MB: 100 200 300 400 500
        120704  7:43:31 InnoDB: Log file /data/mysql/ib_logfile1 did not exist: new to be created
        InnoDB: Setting log file /data/mysql/ib_logfile1 size to 512 MB
        InnoDB: Database physically writes the file full: wait...
        InnoDB: Progress in MB: 100 200 300 400 500
        InnoDB: Cannot initialize created log files because
        InnoDB: data files are corrupt, or new data files were
        InnoDB: created when the database was started previous
        InnoDB: time but the database was not shut down
        InnoDB: normally after that.
        120704  7:43:36 [ERROR] Plugin 'InnoDB' init function returned error.
        120704  7:43:36 [ERROR] Plugin 'InnoDB' registration as a STORAGE ENGINE failed.

    I shut down MySQL properly before making these changes and then started it properly, so I don't know why I am getting these messages. Please help me solve this: I have changed the socket path in my.cnf, but the client is still pointing to the old path...
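
    One hedged guess at the missing piece, shown only as a sketch: a socket= line under [mysqld] moves the server's socket, but the mysql command-line client reads its own socket setting from a [client] (or [mysql]) section, so something like the following may also be needed in /etc/my.cnf:

        [client]
        socket=/data/mysql/mysql.sock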

    Read the article

  • How to crash a program

    - by user2949019
    I have a program called BlueCoat Proxy installed on my school-issued laptop that blocks basically every second website on the Internet, including Stack Exchange, YouTube and Yahoo Answers. I do not have administrator rights, nor can I delete anything in Program Files; I have tried every method of obtaining admin rights that I could find. The program is not accessible in Task Manager (it doesn't even appear there). I tried to close it from the Windows command prompt with commands like taskkill, but that returns "Access is Denied" (I'm only denied access for that program). Does anyone know a method of crashing a program with a batch file or a VB program? I was thinking of something like the ping command, but for a program - maybe automating 1000 meaningless requests to it? Your input on the subject is appreciated; however, telling me that this is wrong or illegal is not.

    Read the article

  • How do you monitor SSD wear in Windows when the drives are presented as 'generic' devices?

    - by MikeyB
    Under Linux, we can monitor SSD wear fairly easily with smartmontools whether the drive is presented as a normal block device or a generic device (which happens when the drive has been hardware RAIDed by certain controllers, such as the one on the IBM HS22). How can we do the equivalent under Windows? Does anyone actually use smartmontools? Or are there other packages out there? The problem is that SCSI generic devices just don't show up in Windows. If the drives aren't RAIDed we can see them fine.

    How I'd do it in Linux:

        sles11-live:~ # lsscsi -g
        [1:0:0:0]  disk  SMART     USB-IBM            8989  /dev/sda  /dev/sg0
        [2:0:0:0]  disk  ATA       MTFDDAK256MAR-1K   MA44  -         /dev/sg1
        [2:0:1:0]  disk  ATA       MTFDDAK256MAR-1K   MA44  -         /dev/sg2
        [2:1:8:0]  disk  LSILOGIC  Logical Volume     3000  /dev/sdb  /dev/sg3

        sles11-live:~ # smartctl -l ssd /dev/sg1
        smartctl 5.42 2011-10-20 r3458 [x86_64-linux-2.6.32.49-0.3-default] (local build)
        Copyright (C) 2002-11 by Bruce Allen, http://smartmontools.sourceforge.net

        Device Statistics (GP Log 0x04)
        Page Offset Size Value Description
          7  =====  =    =    == Solid State Device Statistics (rev 1) ==
          7  0x008  1    26~  Percentage Used Endurance Indicator
                        |_ ~ normalized value

        sles11-live:~ # smartctl -l ssd /dev/sg2
        smartctl 5.42 2011-10-20 r3458 [x86_64-linux-2.6.32.49-0.3-default] (local build)
        Copyright (C) 2002-11 by Bruce Allen, http://smartmontools.sourceforge.net

        Device Statistics (GP Log 0x04)
        Page Offset Size Value Description
          7  =====  =    =    == Solid State Device Statistics (rev 1) ==
          7  0x008  1    3~   Percentage Used Endurance Indicator
                        |_ ~ normalized value

    Read the article

  • How to compress PDFs in Word 2007?

    - by chobo2
    Hi. I am trying to send my cover letter and resume, but apparently the message is too big to send through Craigslist (my computer says the total size is 500 KB), which has a 600 KB limit (so small - it should be at least a meg):

        Hi there. You recently tried to email Some job Email, an anonymous craigslist address. However, your message was too big to be sent through our system. Craigslist has a 600KB limit on the messages we'll send. Please reduce the size of your mail and try again. Thanks for using craigslist.

    The problem is that when I convert my Word 2007 (.docx) files to PDF they become huge - they go from 32 KB to 320 KB. So is there a way I can either get around Craigslist's limit or compress my PDFs a bit to keep it happy? I don't want to send ZIPs and the like, since the person receiving them might not know what to do with them. I'd rather not send .docx either, since I'm not sure whether the recipient has Office 2007 or the compatibility pack installed, and I'd rather just send PDFs anyway (some places require them). Thanks.
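
    One way to shrink an existing PDF, given only as a hedged sketch: it is worth first checking whether Word 2007's own Save As PDF dialog offers a minimum-size/"publishing online" optimization; failing that, Ghostscript can re-compress the file (this assumes Ghostscript is installed - the Windows console binary is typically gswin64c or gswin32c - and the /ebook preset downsamples images to roughly 150 dpi, which is usually fine for a text resume):

        gswin64c -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 -dPDFSETTINGS=/ebook -dNOPAUSE -dBATCH -sOutputFile=resume-small.pdf resume.pdf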

    Read the article

  • Is it possible to mount a disk image, created with dd, to a directory on a mounted external usb hdd?

    - by Keeper Hood
    I have an image of my home partition (/dev/sda3), which I created with the dd command:

        dd if=/dev/sda3 of=/path/to/disk.img

    I deleted the home partition via GParted in order to enlarge my root partition, and then recreated /dev/sda3, smaller than the one I had backed up to the image. Since I have a 2 TB external HDD, I was wondering whether it is possible to mount my backed-up image while it sits on the external HDD and then copy the files into the /home directory. Since the external HDD would already be in a mounted state, I'm unsure whether mounting on top of a mounted device is a good idea. I'm running Slackware 13.37 (64-bit), use ext4 on all the partitions, and resized the root partition with the GParted live CD. I've tried

        mount -t ext4 /path/to/disk.img /mng/image -o loop

    and it gave me a filesystem error (wrong fs type, bad option, bad superblock on /dev/loop/0). Then I ran dmesg | tail, which outputs:

        EXT4-fs (loop0): bad geometry: block count 29009610 exceeds size of device (1679229 blocks)

    I have no idea what to do; I want to restore my /home data from the image I backed up.

    Read the article

  • Tape Supplier in Canada

    - by JohnyD
    I'm not sure if it's kosher to ask, but where can one purchase good-quality, well-priced backup tapes online within Canada? I'm asking because my current vendor, DataWrite, has sent me bad tapes for the third time in a row. I have no idea where they get their product, but when I put in an order for 20 tapes and 2 cleaning tapes and then have to call them 4 times over the next 2 weeks, I expect those tapes to work. As I said, this is the third batch I'm sending back, and we've been dealing with them exclusively for well over 5 years, so I thought I'd ask the pool of knowledge. I'm currently looking for DAT72 tapes, but will soon be looking for LTO-3 and LTO-4. Thank you.

    Read the article

  • Software to manage "application profiles", which services are running, etc., on Windows?

    - by Lasse V. Karlsen
    I am looking for a particular type of program. I use my computers in various "modes", like gaming, programming, or just plain surfing. What I'd like to find is a program that can help me manage these modes. For instance, while programming I might use SQL Server, but while gaming I don't want those services running - though perhaps I'd like Steam to run instead. Basically, the type of program I'm looking for is a visual one that lets me quickly switch modes; when I do, it would start and stop the necessary services and applications in order to leave one mode and enter another. I've looked at programs related to startup management, and I haven't found one that lets me do what I want. At the moment I have batch files, but they're not very good at conveying problems or other things; I'd like a more visual program.
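
    For comparison, roughly what the existing batch-file approach looks like (a sketch only - the service name MSSQL$SQLEXPRESS and the Steam path are placeholders for whatever is installed locally):

        @echo off
        rem "Gaming mode": stop the dev services, start Steam.
        net stop "MSSQL$SQLEXPRESS"
        start "" "C:\Program Files (x86)\Steam\steam.exe"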

    Read the article

  • How do photoshop slices and layer comps interact?

    - by Steve314
    I'm interested in using Photoshop (I have CS2) for some user interface design. I was hoping to be able to use slices and layer comps to mark out particular elements, and use JavaScript scripting to export multiple graphics files and text descriptions (positions and sizes of slices, mainly) that will be used by my program. My problem is that I've never used Photoshop for web design, or otherwise used slices, and I'm not confident that I understand how they interact with layer comps. This is what I believe (and hope) is correct:

    Manual slices aren't affected by layer comps in any way - they aren't saved as part of a layer comp. The same manual slices will be active irrespective of which layer comp is selected.

    Layer-based slices aren't directly affected by layer comps, but they are indirectly affected in that the layer comp saves details of layer position and style. Thus selecting a layer comp may move a layer and change its style, affecting the location and size of its layer-based slice, or may effectively disable the slice by hiding the layer.

    Automatic slices aren't directly affected by layer comps, but are indirectly affected due to changes to the layer-based slices.

    So, layer-based slices (which are my main interest) may move, may change size (to accommodate a style such as a drop shadow), and may be effectively disabled by the layer being hidden. Other details (and all details of manual slices) will remain constant irrespective of which layer comp is active. Is that correct?

    Read the article

  • sqlcmd backup script failing

    - by Bryan
    I'm trying to use a simple batch script to back up a local instance of SQL Express 2012, as follows:

        @echo off
        SET BACKUP_DIR=E:\BackupData
        SET SERVER=.\\sqlexpress
        set dom=%date:~0,2%
        set month=%date:~3,2%
        set year=%date:~6,4%
        set file=%year%-%month%-%dom%
        sqlcmd -S %SERVER% -d master -Q "exec sp_msforeachdb 'BACKUP DATABASE [?] TO DISK=''%BACKUP_DIR%\?.Full.%file%.bak'''"

    The script fails to run with the following error:

        Sqlcmd: Error: Microsoft SQL Server Native Client 10.0 : Client unable to establish connection due to prelogin failure.

    This is on Server 2008 R2; my SQL database instance (on localhost) is named SQLEXPRESS. There is also an instance of SQL Express 2008 on the system (hence Native Client 10.0). The database is configured to use a trusted connection, and the .NET desktop software deployed on our network PCs is able to access the database without any problem. Am I missing something obvious here? I've done a fair amount of searching for this error message and haven't found anything particularly useful so far.
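
    A small, hedged sanity check that can be dropped into the same batch file - it only verifies that sqlcmd can reach the instance under the same credentials, using the more common single-backslash server name (whether the doubled backslash in SET SERVER matters here is an assumption to test, not a confirmed cause):

        sqlcmd -S .\SQLEXPRESS -E -d master -Q "SELECT @@SERVERNAME"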

    Read the article

  • XP Pro repair CD - does Pro/Home disc matter?

    - by Moshe
    I'm trying to repair an XP Pro install using the repair CD. The issue is a corrupted registry. Following the Microsoft Help Center instructions, I made a batch file to replace the registry, but it's not working. Does the CD need to match the edition and service pack of the Windows installation on the disk? How can I tell (retroactively, with no access to Windows) which edition/SP is installed? Any other ideas or thoughts? I am considering an upgrade install to Windows 7... Thanks!

    Read the article

  • [Ubuntu 10.04] mdadm - Can't get RAID5 Array To Start

    - by Matthew Hodgkins
    Hello. After a power failure my RAID array refuses to start. When I boot I have to run

        sudo mdadm --assemble --force /dev/md0 /dev/sdb1 /dev/sdc1 /dev/sdd1 /dev/sde1 /dev/sdf1 /dev/sdg1

    to get mdadm to notice the array. Here are the details (after I force-assemble):

        sudo mdadm --misc --detail /dev/md0:
        /dev/md0:
                Version : 00.90
          Creation Time : Sun Apr 25 01:39:25 2010
             Raid Level : raid5
          Used Dev Size : 1465135872 (1397.26 GiB 1500.30 GB)
           Raid Devices : 6
          Total Devices : 6
        Preferred Minor : 0
            Persistence : Superblock is persistent
            Update Time : Thu Jun 17 23:02:38 2010
                  State : active, Not Started
         Active Devices : 6
        Working Devices : 6
         Failed Devices : 0
          Spare Devices : 0
                 Layout : left-symmetric
             Chunk Size : 128K
                   UUID : 44a8f730:b9bea6ea:3a28392c:12b22235 (local to host hodge-fs)
                 Events : 0.1249691

            Number   Major   Minor   RaidDevice   State
               0       8       65        0        active sync   /dev/sde1
               1       8       81        1        active sync   /dev/sdf1
               2       8       97        2        active sync   /dev/sdg1
               3       8       49        3        active sync   /dev/sdd1
               4       8       33        4        active sync   /dev/sdc1
               5       8       17        5        active sync   /dev/sdb1

    mdadm.conf:

        # by default, scan all partitions (/proc/partitions) for MD superblocks.
        # alternatively, specify devices to scan, using wildcards if desired.
        DEVICE partitions /dev/sdb1 /dev/sdb1

        # auto-create devices with Debian standard permissions
        CREATE owner=root group=disk mode=0660 auto=yes

        # automatically tag new arrays as belonging to the local system
        HOMEHOST <system>

        # definitions of existing MD arrays
        ARRAY /dev/md0 level=raid5 num-devices=6 UUID=44a8f730:b9bea6ea:3a28392c:12b22235

    Any help would be appreciated.

    Read the article

  • FOR command cannot see hidden files

    - by Synetech
    I'm struggling with one of the most frustrating bugs I've ever come across.

    Bug description: The for command of the command interpreter cannot see hidden files.

    Reproduction steps:
        1. Create a temporary directory.
        2. Create a few files.
        3. Assign a variety of attributes to the files (including hidden).
        4. Use a command like for %i in (*) do echo "%i"

    Expected results: All files are processed in the for loop, either by default or through a switch.

    Actual results: Files with any attribute other than hidden are processed; files flagged as hidden are skipped. There is no switch to the for command that allows it to process hidden files.

    Implications: There is no way to process all files from the command prompt.

    Question: How the heck can hidden files be processed from the command prompt or batch files (at least in Windows, if not DOS)?
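
    A commonly cited workaround, shown as a sketch rather than a fix for the underlying behaviour: have dir enumerate the files (its /a switch does include hidden entries) and feed the names to for /f:

        rem Batch-file syntax; at an interactive prompt use %F instead of %%F.
        for /f "delims=" %%F in ('dir /b /a:-d') do echo "%%F"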

    Read the article
