Search Results

  • How to get current datetime on windows command line, in a suitable format for using in a filename?

    - by Rory
    What's a Windows command-line statement(s) I can use to get the current datetime in a format that I can put into a filename? I want to have a .bat file that zips up a directory into an archive with the current date and time as part of the name, e.g. "Code_2008-10-14_2257.zip". Is there any easy way I can do this, independent of the regional settings of the machine? I don't really mind about the date format; ideally it'd be yyyy-mm-dd, but anything simple is fine. So far I've got this, which on my machine gives me "Tue_10_14_2008_230050_91":

        rem Get the datetime in a format that can go in a filename.
        set _my_datetime=%date%_%time%
        set _my_datetime=%_my_datetime: =_%
        set _my_datetime=%_my_datetime::=%
        set _my_datetime=%_my_datetime:/=_%
        set _my_datetime=%_my_datetime:.=_%
        rem Now use the timestamp in a new zip file name.
        "d:\Program Files\7-Zip\7z.exe" a -r Code_%_my_datetime%.zip Code

    I can live with this, but it seems a bit clunky. Ideally it'd be briefer and have the format mentioned earlier. I'm using Windows Server 2003 and Win XP Pro, and I don't want to install additional utilities to achieve this (although I realise there are some that will do nice date formatting).
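
    A briefer, regional-settings-independent sketch using WMIC, which ships with XP Pro and Server 2003 (the 7-Zip line is carried over from the question):

        rem WMIC prints LocalDateTime as e.g. 20081014225700.123456+060,
        rem regardless of locale, so fixed-position slicing is safe.
        rem The "find" filter keeps only the line containing the timestamp.
        for /f %%x in ('wmic os get localdatetime ^| find "."') do set _ldt=%%x
        rem Slice into yyyy-mm-dd_hhmm, e.g. 2008-10-14_2257.
        set _my_datetime=%_ldt:~0,4%-%_ldt:~4,2%-%_ldt:~6,2%_%_ldt:~8,4%
        "d:\Program Files\7-Zip\7z.exe" a -r Code_%_my_datetime%.zip Code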

  • Problem with sending cookies with file_get_contents

    - by Ikke
    Hi, I'm trying to get the contents of another file with file_get_contents (don't ask why). I have two files: test1.php and test2.php. test1.php returns a string based on the user that is logged in. test2.php tries to get the contents of test1.php; it is executed by the browser, so it has the cookies. To send the cookies along with file_get_contents, I create a stream context:

        $opts = array('http' => array('header' => 'Cookie: ' . $_SERVER['HTTP_COOKIE'] . "\r\n"));

    I'm retrieving the contents with:

        $contents = file_get_contents("http://www.domain.com/test1.php", false, $opts);

    But now I get the error:

        Warning: file_get_contents(http://www.domain.com/test1.php) [function.file-get-contents]:
        failed to open stream: HTTP request failed! HTTP/1.1 404 Not Found

    Does somebody know what I'm doing wrong here? Edit -- forgot to mention: without the stream context, the page just loads, but without the cookies I don't get the info I need.
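
    A minimal sketch of the likely fix: file_get_contents() expects a context resource as its third argument, not the options array itself, so the array needs to go through stream_context_create() first:

        <?php
        $opts = array('http' => array(
            'header' => 'Cookie: ' . $_SERVER['HTTP_COOKIE'] . "\r\n",
        ));
        // Wrap the options array in an actual stream context resource.
        $context  = stream_context_create($opts);
        $contents = file_get_contents('http://www.domain.com/test1.php', false, $context);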

  • Windows batch - loop over folder string and parse out last folder name

    - by Tim Peel
    Hi, I need to grab the folder name of the currently executing batch file. I have been trying to loop over the current directory using the following syntax (which is wrong at present):

        set mydir = %~p0
        for /F "delims=\" %i IN (%mydir%) DO @echo %i

    There are a couple of issues: I cannot seem to pass the 'mydir' variable value in as the search string -- it only seems to work if I pass in commands -- and I have the syntax wrong and cannot work out why. My thinking was to loop over the folder string with a '\' delimiter, but this is causing problems too. If I set a variable on each iteration, the last value set will be the current folder name. For example, given the following path:

        C:\Folder1\Folder2\Folder3\Archive.bat

    I would expect to parse out the value 'Folder3'. I need to parse that value out, as it will be part of another folder name I am going to create further down in the batch file. Many thanks if anyone can help. I may be barking up the wrong tree completely, so any other approaches would be gratefully received also. Tim
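
    A sketch that sidesteps the loop parsing entirely: %~dp0 expands to the batch file's own folder with a trailing backslash, and appending "." produces a path whose "name" part is the last folder, which %%~nxI can extract directly:

        @echo off
        rem For C:\Folder1\Folder2\Folder3\Archive.bat, "%~dp0." canonicalizes
        rem to C:\Folder1\Folder2\Folder3, so %%~nxI yields Folder3.
        for %%I in ("%~dp0.") do set "lastFolder=%%~nxI"
        echo %lastFolder%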

  • $HTTP_RAW_POST_DATA and Wordpress media_handle_upload

    - by BaronVonKaneHoffen
    Hi there! So I'm making something for which users need to upload images from a canvas element (on the front end) using Wordpress's in-built Media Upload API. I've successfully uploaded images with the API from a file input in a form:

        <input type="file" name="async-upload" id="async-upload" size="40" />
        ...
        <?php $new_attach_id = media_handle_upload( 'async-upload', $new_id ); ?>

    and I've saved images from the canvas using my own script:

        <script type="text/javascript">
        ...
        var img = canvas.toDataURL("image/png");
        ...
        ajax.send(img);
        </script>
        ...
        <?php
        $imageData = $GLOBALS['HTTP_RAW_POST_DATA'];
        $filteredData = substr($imageData, strpos($imageData, ",") + 1);
        $unencodedData = base64_decode($filteredData);
        $fp = fopen('saved_images/canv_save_test.png', 'wb');
        fwrite($fp, $unencodedData);
        ...
        ?>

    Problem is, Wordpress's media_handle_upload() only accepts an index into the $_FILES array for the upload. So the question is: how can I pass the image data from HTTP_RAW_POST_DATA to it? Could I make $_FILES['tmp-name'] point to it somehow? Could I use another script as an intermediate step somehow? Could I unleash the monkeys on the typewriter until they come up with the answer? Any help very very much appreciated! Thanks, Kane
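
    A sketch of the intermediate-step idea, assuming it runs where the WordPress admin includes (file.php, media.php, image.php) are loaded: write the decoded canvas data to a temp file, then hand it to media_handle_sideload(), which takes a $_FILES-style array rather than an index into $_FILES:

        <?php
        $imageData     = $GLOBALS['HTTP_RAW_POST_DATA'];
        $unencodedData = base64_decode(substr($imageData, strpos($imageData, ',') + 1));

        $tmp = wp_tempnam('canvas.png');          // temp file under WP's temp dir
        file_put_contents($tmp, $unencodedData);

        // Minimal $_FILES-style array that media_handle_sideload() accepts.
        $file_array = array(
            'name'     => 'canvas.png',
            'tmp_name' => $tmp,
        );
        $new_attach_id = media_handle_sideload($file_array, $new_id);
        if (is_wp_error($new_attach_id)) {
            @unlink($tmp);                        // clean up if the sideload failed
        }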

  • How should I name my SQL query files? Should I use some methodology?

    - by Mehper C. Palavuzlar
    We have an Oracle 10g database (a huge one) in our company, and I provide employees with data upon request. My problem is that I save almost every SQL query I write, and now my list has grown too large. I want to organize and rename these .sql files so that I can find the one I want easily. At the moment I'm using folders named Sales Dept, Field Team, Planning Dept, Special, etc., and under those folders there are .sql files like:

        Delivery_sales_1, Delivery_sales_2, ...
        Sent_sold_lostsales_endpoints, ...
        Sales_provinces_period, Returnrates_regions_bymonths, ...
        Jack_1, Steve_1, Steve_2, ...

    I try to name the files after their content, but this makes the names long and still doesn't completely meet my needs. Sometimes someone demands a special report and I name the file after that person, but this is also not so good. I know duplicates or very similar files are accumulating over time, but I don't have control over them. Can you show me the right direction to rename all these files and folders and organize my queries for easy and better control? TIA.

  • Batch script is not executed if chcp was called

    - by Andy
    Hello! I'm trying to delete some files with Unicode characters in their names from a batch script (it's a requirement). So I run cmd and execute:

        > chcp 65001

    effectively setting the codepage to UTF-8. And it works:

        D:\temp\1>dir
         Volume in drive D has no label.
         Volume Serial Number is 8C33-61BF

         Directory of D:\temp\1

        02.02.2010  09:31    <DIR>          .
        02.02.2010  09:31    <DIR>          ..
        02.02.2010  09:32               508 1.txt
        02.02.2010  09:28                12 delete.bat
        02.02.2010  09:20                95 delete.cmd
        02.02.2010  09:13    <DIR>          Rún
        02.02.2010  09:13    <DIR>          ????? ???????
                       3 File(s)            615 bytes
                       4 Dir(s)  11 576 438 784 bytes free

        D:\temp\1>rmdir Rún

        D:\temp\1>dir
         Volume in drive D has no label.
         Volume Serial Number is 8C33-61BF

         Directory of D:\temp\1

        02.02.2010  09:56    <DIR>          .
        02.02.2010  09:56    <DIR>          ..
        02.02.2010  09:32               508 1.txt
        02.02.2010  09:28                12 delete.bat
        02.02.2010  09:20                95 delete.cmd
        02.02.2010  09:13    <DIR>          ????? ???????
                       3 File(s)            615 bytes
                       3 Dir(s)  11 576 438 784 bytes free

    Then I put the same rmdir commands in a batch script and save it in UTF-8 encoding. But when I run it, nothing happens -- literally nothing: not even echo works from the batch script in this case. Saving the script in OEM encoding does not help either. So it seems that when I change the codepage to UTF-8 in the console, scripts just stop working. Does somebody know how to fix that?
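
    A commonly suggested workaround, hedged -- cmd's UTF-8 support in the XP/2003 era is known to be fragile: leave the console in its default code page when the script starts, switch to 65001 from inside the script, and save the file as UTF-8 without a byte-order mark:

        @echo off
        rem Saved as UTF-8 *without* BOM; a BOM's bytes would otherwise be
        rem read as garbage before the first command.
        chcp 65001 > nul
        rmdir Rún
        rem Restore the original code page when done (850 is an assumption;
        rem use whatever your console's default is).
        chcp 850 > nul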

  • Out-of-memory algorithms for addressing large arrays

    - by reve_etrange
    I am trying to deal with a very large dataset. I have k = ~4200 matrices (varying sizes) which must be compared combinatorially, skipping non-unique and self comparisons. Each of the k(k-1)/2 comparisons produces a matrix, which must be indexed against its parents (i.e. I can find out where it came from). The convenient way to do this is to (triangularly) fill a k-by-k cell array with the result of each comparison. These are ~100 x ~100 matrices, on average; using single-precision floats, it works out to 400 GB overall. I need to 1) generate the cell array, or pieces of it, without trying to place the whole thing in memory and 2) access its elements (and their elements) in like fashion. My attempts have been inefficient due to reliance on MATLAB's eval() as well as save and clear occurring in loops:

        for i=1:k
            [~,m] = size(data{i});
            cur_var = ['H' int2str(i)];
            %# if i == 1; save('FileName'); end;  %# If using a single MAT file and need to create it.
            eval([cur_var ' = cell(1,k-i);']);
            for j=i+1:k
                [~,n] = size(data{j});
                eval([cur_var '{i,j} = zeros(m,n,''single'');']);
                eval([cur_var '{i,j} = compare(data{i},data{j});']);
            end
            save(cur_var,cur_var);  %# Add '-append' when using a single MAT file.
            clear(cur_var);
        end

    The other thing I have done is to perform the split when mod((i+j-1)/2, max(factor(k(k-1)/2))) == 0. This divides the result into the largest number of same-size pieces, which seems logical. The indexing is a little more complicated, but not too bad, because a linear index could be used. Does anyone know/see a better way?
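
    A sketch that drops eval() in favour of one plain variable per row, saved with -v7.3 so it can later be read piecewise through matfile() without loading the whole file (assumes a MATLAB with matfile, i.e. R2011b or later):

        for i = 1:k
            row = cell(1, k);                 % only this row is ever in memory
            for j = i+1:k
                row{j} = single(compare(data{i}, data{j}));
            end
            save(sprintf('H%d.mat', i), 'row', '-v7.3');
            clear row
        end

        % Later: read a single comparison without loading the rest.
        m = matfile('H5.mat');
        c = m.row(1, 12);                     % returns a 1x1 cell
        block = c{1};                         % the data{5}-vs-data{12} matrix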

  • Uniquely identify files/folders in NTFS, even after move/rename

    - by Felix Dombek
    I haven't found a backup (synchronization) program which does what I want, so I'm thinking about writing my own. What I have now does the following: it goes through the data in the source and, for every file which has its archive bit set OR does not exist in the destination, copies it to the destination, overwriting a possibly existing file. When done, it checks every file in the destination against the source, and if a file doesn't exist in the source, deletes it.

    The problem is that if I move or rename a large folder, it first gets copied to the destination even though it is, in principle, already there, just under a different path; the folder which was already there is only deleted afterwards. Apart from the unnecessary copying, I frequently run into space problems because my backup drive isn't large enough to hold the original data twice.

    Is there a way to programmatically identify such moved/renamed files or folders, i.e. by NTFS ID or physical location on media or something else? Are there solutions to this problem? I do not care about the programming language, but hints for doing this with Python, C++, C#, Java or Prolog are appreciated.
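
    One known handle for this: NTFS assigns each file a volume-unique file index that stays the same across moves and renames on the same volume, retrievable via GetFileInformationByHandle. A C# sketch for files (directories additionally need CreateFile with FILE_FLAG_BACKUP_SEMANTICS, which File.Open cannot supply):

        using System;
        using System.IO;
        using System.Runtime.InteropServices;
        using Microsoft.Win32.SafeHandles;

        static class NtfsId
        {
            [StructLayout(LayoutKind.Sequential)]
            struct BY_HANDLE_FILE_INFORMATION
            {
                public uint FileAttributes;
                public System.Runtime.InteropServices.ComTypes.FILETIME CreationTime;
                public System.Runtime.InteropServices.ComTypes.FILETIME LastAccessTime;
                public System.Runtime.InteropServices.ComTypes.FILETIME LastWriteTime;
                public uint VolumeSerialNumber;
                public uint FileSizeHigh;
                public uint FileSizeLow;
                public uint NumberOfLinks;
                public uint FileIndexHigh;
                public uint FileIndexLow;
            }

            [DllImport("kernel32.dll", SetLastError = true)]
            static extern bool GetFileInformationByHandle(
                SafeFileHandle handle, out BY_HANDLE_FILE_INFORMATION info);

            // 64-bit NTFS file index, stable across move/rename within one
            // volume (pair it with VolumeSerialNumber to be safe).
            public static ulong FileIndex(string path)
            {
                using (FileStream fs = File.Open(path, FileMode.Open,
                                                 FileAccess.Read, FileShare.ReadWrite))
                {
                    BY_HANDLE_FILE_INFORMATION info;
                    if (!GetFileInformationByHandle(fs.SafeFileHandle, out info))
                        throw new IOException("GetFileInformationByHandle failed");
                    return ((ulong)info.FileIndexHigh << 32) | info.FileIndexLow;
                }
            }
        }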

  • std::ifstream buffer caching

    - by ledokol
    Hello everybody. In my application I'm trying to merge sorted files (keeping them sorted, of course), so I have to iterate through each element in both files to write the minimal one to the third file. This works pretty slowly on big files. As far as I can see there is no other choice (the iteration has to be done), so I'm trying to optimize file loading. I can use some amount of RAM for buffering: instead of reading 4 bytes from both files every time, I could read something like 100 MB once and work with that buffer afterwards, until there is no element left in it, then refill it. But I guess ifstream is already doing that -- will it actually give me more performance, and is there any reason to do it myself? If fstream does buffer, maybe I can change the size of that buffer?

    Added -- my current code looks like this (pseudocode):

        // this is done in a loop
        int i1 = input1.read_integer();
        int i2 = input2.read_integer();
        if (!input1.eof() && !input2.eof()) {
            if (i1 < i2) {
                output.write(i1);
                input2.seek_back(sizeof(int));
            } else {
                input1.seek_back(sizeof(int));
                output.write(i2);
            }
        } else {
            if (input1.eof())
                output.write(i2);
            else if (input2.eof())
                output.write(i1);
        }

    What I don't like here:

    - seek_back: I have to seek back to the previous position, as there is no way to peek 4 bytes ahead
    - too much reading from the file
    - if one of the streams hits EOF, the loop still keeps checking that stream instead of writing the contents of the other stream directly to the output (though this is not a big issue, because the chunk sizes are almost always equal)

    Can you suggest an improvement for that? Thanks.
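
    A sketch of both ideas: give each stream a much larger buffer via pubsetbuf() (which must be called before open() to take effect), and keep the current value of each stream in a variable so no seek_back is needed; once one stream hits EOF, the other is drained directly:

        #include <fstream>
        #include <vector>

        int main() {
            std::vector<char> b1(100 * 1024 * 1024), b2(100 * 1024 * 1024);
            std::ifstream in1, in2;
            in1.rdbuf()->pubsetbuf(&b1[0], b1.size());  // before open(), or it is ignored
            in2.rdbuf()->pubsetbuf(&b2[0], b2.size());
            in1.open("a.bin", std::ios::binary);
            in2.open("b.bin", std::ios::binary);
            std::ofstream out("merged.bin", std::ios::binary);

            int i1, i2;
            bool ok1 = !in1.read((char*)&i1, sizeof i1).fail();
            bool ok2 = !in2.read((char*)&i2, sizeof i2).fail();
            while (ok1 && ok2) {               // advance only the stream we consumed
                if (i1 < i2) {
                    out.write((char*)&i1, sizeof i1);
                    ok1 = !in1.read((char*)&i1, sizeof i1).fail();
                } else {
                    out.write((char*)&i2, sizeof i2);
                    ok2 = !in2.read((char*)&i2, sizeof i2).fail();
                }
            }
            while (ok1) {                      // drain whichever stream is left
                out.write((char*)&i1, sizeof i1);
                ok1 = !in1.read((char*)&i1, sizeof i1).fail();
            }
            while (ok2) {
                out.write((char*)&i2, sizeof i2);
                ok2 = !in2.read((char*)&i2, sizeof i2).fail();
            }
        }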

  • download large files using servlet

    - by niks
    I am using Apache Tomcat Server 6 and Java 1.6, and I am trying to write large MP3 files to the ServletOutputStream for a user to download. Files range from 50 to 750 MB at the moment. The smaller files aren't causing too much of a problem, but with the larger files I am getting a socket exception: broken pipe.

        File fileMp3 = new File(objDownloadSong.getStrSongFolder() + "/" + strSongIdName);
        FileInputStream fis = new FileInputStream(fileMp3);
        response.setContentType("audio/mpeg");
        response.setHeader("Content-Disposition", "attachment; filename=\"" + strSongName + ".mp3\";");
        response.setContentLength((int) fileMp3.length());
        OutputStream os = response.getOutputStream();
        try {
            int byteRead = 0;
            while ((byteRead = fis.read()) != -1) {
                os.write(byteRead);
            }
            os.flush();
        } catch (Exception excp) {
            downloadComplete = "-1";
            excp.printStackTrace();
        } finally {
            os.close();
            fis.close();
        }
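
    A sketch of a drop-in replacement for the copy loop: read in chunks rather than one byte per call (unbuffered single-byte reads are extremely slow at these sizes), and note that a broken-pipe SocketException during a download usually just means the client cancelled, so it can be logged quietly rather than treated as a failure:

        byte[] buffer = new byte[8192];
        int read;
        while ((read = fis.read(buffer)) != -1) {
            os.write(buffer, 0, read);   // write only the bytes actually read
        }
        os.flush();

    For what it's worth, response.setContentLength((int) ...) only overflows past 2 GB, so the cast is still safe at 750 MB.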

  • c++ simple conditional logging

    - by Sunny
    Disclaimer: I'm not a C++ developer; I can only do basic things. (I understand pointers; my knowledge is just very rusty -- I haven't touched C/C++ for about 20 years :) )

    The setup: I have an Outlook add-in, written in C#/.NET 1.1. It uses a C++ shim to load. Usually this works pretty well, and I use NLog in my C# code for logging purposes. But sometimes the add-in fails to load, i.e. it does not hit the managed code at all, so the log files tell me nothing about the problem. So I need to hook some basic logging into the C++ shim -- just writing to a file. I need to make it as simple as possible for our users to enable; actually, I would prefer not to ship it by default.

    I was thinking about something which checks whether a specific DLL (the logging DLL) is present, and if so, uses it; otherwise it just doesn't log anything. That way, when I have a user with such a problem, I can send him only the logging DLL, the user saves it in the runtime directory, and I'll have the file. I guess this has to be done with some form of a factory solution, which returns either a dummy logger or, if the DLL is found, a real one.

    Another option would be to make some simple logger and rebuild the shim with or without it, based on directives. This is not the desirable approach, because the shim needs to be signed, and I would have to instruct the user to make a backup copy of the "real" one, then restore it when done, etc., instead of just saving and deleting a DLL.

    I'd appreciate any good suggestion on how to approach this, together with links or sample code. Cheers
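
    A sketch of the optional-DLL factory idea at the Win32 level: LoadLibrary probes for the logging DLL at runtime, and everything degrades to a no-op when it is absent. The DLL name and its export are hypothetical placeholders:

        #include <windows.h>

        typedef void (__cdecl *LogFn)(const char*);
        static LogFn g_log = NULL;

        void InitLogging()
        {
            // Hypothetical DLL the user drops next to the shim when debugging.
            HMODULE h = LoadLibraryA("ShimLogger.dll");
            if (h != NULL)
                g_log = (LogFn)GetProcAddress(h, "LogMessage");  // hypothetical export
        }

        void Log(const char* msg)
        {
            if (g_log != NULL)
                g_log(msg);   // silently a no-op when the DLL was not shipped
        }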

  • Find the row with max characters (C)

    - by l_core
    I have written a program in C to find the row with the max number of characters. Here is the code:

        #include <stdio.h>
        #include <stdlib.h>
        #include <ctype.h>
        #include <string.h>

        int main (int argc, char *argv[])
        {
            char c;                              /* used to store the character with getc */
            int c_tot = 0, c_rig = 0, c_max = 0; /* counters of characters */
            int r_tot = 0;                       /* counter of rows */
            FILE *fptr;

            fptr = fopen(argv[1], "r");
            if (fptr == NULL || argc != 2) {
                printf("Error opening the file %s\n", argv[1]);
                exit(EXIT_FAILURE);
            }
            while ((c = getc(fptr)) != EOF) {
                if (c != ' ' && c != '\n') {
                    c_tot++;
                    c_rig++;
                }
                if (c == '\n') {
                    r_tot++;
                    if (c_rig > c_max)
                        c_max = c_rig;
                    c_rig = 0;
                }
            }
            printf("Total rows: %d\n", r_tot);
            printf("Total characters: %d\n", c_tot);
            printf("Total characters in a row: %d\n", c_max);
            printf("Average number of characters on a row: %d\n", (c_tot/r_tot));
            printf("The row with max characters is: %s\n", ??????);
            return 0;
        }

    I can easily find the row with the highest number of characters, but how can I print it out? Thank you, folks.
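
    A sketch of one way to finish it: buffer the current row as it is read and copy it whenever a new maximum is found. Note that the result of getc() should be stored in an int, not a char, or EOF detection can fail on some platforms. Rows are assumed to be shorter than 4096 characters:

        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>

        #define MAXLINE 4096

        int main(int argc, char *argv[])
        {
            char cur[MAXLINE], best[MAXLINE] = "";
            int c, pos = 0;
            int c_tot = 0, c_rig = 0, c_max = 0, r_tot = 0;
            FILE *fptr;

            if (argc != 2 || (fptr = fopen(argv[1], "r")) == NULL) {
                fprintf(stderr, "Error opening the file\n");
                exit(EXIT_FAILURE);
            }
            while ((c = getc(fptr)) != EOF) {
                if (c == '\n') {                 /* end of row: check the maximum */
                    cur[pos] = '\0';
                    r_tot++;
                    if (c_rig > c_max) {
                        c_max = c_rig;
                        strcpy(best, cur);       /* remember the row itself */
                    }
                    c_rig = 0;
                    pos = 0;
                } else {
                    if (pos < MAXLINE - 1)
                        cur[pos++] = (char)c;
                    if (c != ' ') {              /* same counting rule as the question */
                        c_tot++;
                        c_rig++;
                    }
                }
            }
            fclose(fptr);
            printf("Total rows: %d\n", r_tot);
            printf("Max characters in a row: %d\n", c_max);
            printf("The row with max characters is: %s\n", best);
            return 0;
        }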

  • nginx php5-fpm "File not found" -- FastCGI sent in stderr: "Primary script unknown"

    - by jmfayard
    So I'm trying to run the nginx web server with php5-fpm for the first time, on a Debian Wheezy server. Hitting a PHP file simply displays "File not found". I have done my research (wasted a lot of hours, actually); there are a lot of people with similar problems, yet I didn't succeed in correcting it with what worked for them. I still have the same error:

        $ tail /var/log/nginx/access.log /var/log/nginx/error.log /var/log/php5-fpm.log | less
        ==> /var/log/nginx/error.log <==
        2013/10/26 21:36:00 [error] 6900#0: *1971 FastCGI sent in stderr: "Primary script unknown"
        while reading response header from upstream,

    I have tried a lot of things; it's hard to remember exactly what. I have put my config files on GitHub: my /etc/nginx/nginx.conf and my /etc/php5/fpm/php-fpm.conf. Currently, the nginx.conf configuration uses this:

        server {
            server_name mydomain.tld;
            root /srv/data1/test;

            location ~ \.php$ {
                try_files $uri =404;
                fastcgi_pass 127.0.0.1:9000;
                fastcgi_index index.php;
                fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
                include fastcgi_params;
            }
        }

    /etc/php5/fpm/pool.d/www.conf contains:

        listen = 127.0.0.1:9000

    I have tried the Unix socket version; same thing:

        fastcgi_pass unix:/var/run/php5-fpm.sock;

    I made sure the server is started:

        $ netstat -alnp | grep LISTEN
        tcp   0  0 127.0.0.1:9000  0.0.0.0:*  LISTEN  6913/php-fpm.conf)
        tcp   0  0 127.0.0.1:3306  0.0.0.0:*  LISTEN  4785/mysqld
        tcp   0  0 0.0.0.0:842     0.0.0.0:*  LISTEN  2286/inetd
        tcp   0  0 0.0.0.0:111     0.0.0.0:*  LISTEN  2812/rpcbind
        tcp   0  0 0.0.0.0:80      0.0.0.0:*  LISTEN  5710/nginx
        tcp   0  0 0.0.0.0:22      0.0.0.0:*  LISTEN  2560/sshd
        tcp   0  0 0.0.0.0:443     0.0.0.0:*  LISTEN  5710/nginx
        tcp6  0  0 :::111          :::*       LISTEN  2812/rpcbind
        unix  2  [ ACC ]  STREAM     LISTENING  323648  6574/tmux      /tmp//tmux-1000/default
        unix  2  [ ACC ]  STREAM     LISTENING  619072  6790/fcgiwrap  /var/run/fcgiwrap.socket
        unix  2  [ ACC ]  SEQPACKET  LISTENING  323     464/udevd      /run/udev/control
        unix  2  [ ACC ]  STREAM     LISTENING  610686  2812/rpcbind   /var/run/rpcbind.sock
        unix  2  [ ACC ]  STREAM     LISTENING  318633  4785/mysqld    /var/run/mysqld/mysqld.sock

    Each time I modify the nginx.conf file, I make sure to reload the configuration:

        nginx -t && nginx -s reload && echo "nginx configuration reloaded"

    and the same for php5-fpm:

        /etc/init.d/php5-fpm restart

    Thanks for your help :-)
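
    A sketch of the two checks that most often resolve "Primary script unknown" with a setup like this: make sure the explicit SCRIPT_FILENAME is not overridden by anything inside fastcgi_params (some distributions define it there), and make sure the pool user can actually reach the docroot:

        location ~ \.php$ {
            try_files $uri =404;
            include fastcgi_params;    # include first...
            # ...then set SCRIPT_FILENAME so it wins over any default:
            fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
            fastcgi_index index.php;
            fastcgi_pass 127.0.0.1:9000;
        }

        # From a shell, verify the php-fpm pool user (www-data on Debian)
        # can traverse and read the docroot:
        #   sudo -u www-data ls /srv/data1/test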

  • CertMgr fails trying to import an SPC file

    - by nsr81
    We have an SPC file which came with the Cisco IP Communicator installer. It needs to be imported into the localMachine ROOT store; however, when certmgr.exe is run against this SPC file, it errors out, whether it is run from within the installer or manually. The command I've been using is:

        certmgr.exe -add -all CDPcredentials.spc -s -r localMachine root

    The result displayed is:

        Error: Failed to save to the destination store
        CertMgr Failed

    There is no other information -- no log file, nothing in the Event Viewer. It's almost as if the ROOT store is in a read-only state. I would also like to point out that I'm able to import single certificates, just not an SPC file, which contains multiple certificates. I have also tried different versions of the CertMgr utility. Running on Windows 7 Enterprise 64-bit. Any assistance would be appreciated.
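
    Two hedged suggestions: "Failed to save to the destination store" is commonly just a rights problem, so first retry from an elevated (Run as administrator) prompt, since writing to the machine Root store requires elevation on Windows 7. Failing that, certutil can import PKCS#7/SPC bundles into the machine Root store and tends to give more detailed error messages:

        certutil -addstore Root CDPcredentials.spc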

  • File system loop detected in /var/named/chroot/var/named/ CentOS6.3

    - by wilco
    When I use the find command in a shell, I get the following error:

        find: File system loop detected; '/var/named/chroot/var/named' is part of the same file system loop as '/var/named'.

    I verified the inode number, and it comes out the same for both paths:

        [root@serverone ~]# ls -ldi /var/named/chroot/var/named/ /var/named
        6684673 drwxr-x--- 6 root named 4096 Sep  7 17:17 /var/named
        6684673 drwxr-x--- 6 root named 4096 Sep  7 17:17 /var/named/chroot/var/named/

    I cannot remove the directory with rm -f; it says it is a directory. This is a minimal CentOS 6.3 install with Plesk 11. Any help would be appreciated.
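
    This is expected when the BIND chroot setup bind-mounts /var/named into the chroot, as CentOS 6's bind-chroot packaging does: both paths really are the same directory (hence the matching inodes), so the fix is to exclude one of them from the search rather than delete anything. A sketch:

        # Confirm it is a bind mount rather than a broken directory:
        grep named /proc/mounts

        # Search without descending into the chroot copy:
        find /var/named -path /var/named/chroot -prune -o -print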

  • Unable to format disk: 'The system cannot find the file specified'

    - by ACarter
    I have a USB flash drive, which I may have mucked up, so I used DISKPART's CLEAN to clean it up. I created a simple volume and tried to format it (this is all using Windows' Disk Management). I was told:

        The system cannot find the file specified.

    So I tried using DISKPART (as an admin):

        DISKPART> select volume 9

        Volume 9 is the selected volume.

        DISKPART> format recommended

        DiskPart has encountered an error: The system cannot find the file specified.
        See the System Event Log for more information.

    As you can see, no luck. When I plug the drive in, the computer makes a beep noise as though it has recognised something, but nothing appears in My Computer. How can I format the disk so I can use it again?
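
    If Disk Management and DISKPART's FORMAT both fail, a sketch of the full clean-and-rebuild sequence worth one more try from an elevated DISKPART session -- destructive, so double-check the disk number from LIST DISK first (disk 1 below is an assumption):

        list disk
        select disk 1
        clean
        create partition primary
        select partition 1
        active
        format fs=fat32 quick
        assign

    If this still reports the same error, the stick's flash controller is likely failing, and no software-level format will fix it.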

  • Nginx + PHP-FPM on Ubuntu giving "upstream sent invalid status" on uploading Joomla extension zip file

    - by faridv
    I have an Ubuntu server running in an ESV VM environment, and I've installed a web server with this configuration:

    - nginx 1.0.5
    - PHP 5.3.6 with PHP-FPM
    - MySQL 5.1.62

    I have an installation of the latest version of Joomla on this server, and when I try to upload an install package (a zip file containing a Joomla extension's files) I get "502 Bad Gateway", with the following error in the nginx log file:

        2012/05/13 11:22:21 [error] 19911#0: *20 upstream sent invalid status "-1 Copy failed"
        while reading response header from upstream, client: 10.10.56.70, server: localhost,
        request: "POST /administrator/index.php?option=com_installer&view=install HTTP/1.1",
        upstream: "fastcgi://127.0.0.1:9000", host: "radio.xx.xx",
        referrer: "http://radio.xx.xx/administrator/index.php?option=com_installer"

    I've searched all over the internet and changed too many parameters of the PHP, FPM and nginx configuration to list, including increasing execution times, but my problem still remains. I'm pretty sure it has nothing to do with my Joomla installation and the problem is in the web server, but there are no usable log messages except the one above. Can anyone help me with this problem?
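
    A hedged sketch of where to look: "Copy failed" is Joomla reporting that it could not copy the uploaded archive out of PHP's upload temp area, so the usual suspects are the upload temp dir not being writable by the pool user, the PHP size limits being smaller than the package, or Joomla's own $tmp_path in configuration.php pointing somewhere unwritable. Illustrative php.ini values for the fpm pool:

        ; The temp dir must exist and be writable by the php-fpm pool user.
        upload_tmp_dir      = /var/lib/php5/uploads
        upload_max_filesize = 32M
        post_max_size       = 32M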

  • Unable to execute file in the temporary directory

    - by Bixal
    I am using Windows 8.1 Pro 64-bit. I see this error almost every time I launch an executable file (to install it), though not for all of them, and I don't see the error when I use Run as Administrator. I looked around and found a solution: giving the current user permissions on the temp folder. That solves the problem, but only temporarily -- it comes back after restarting the PC. What can I do to prevent this? I don't really want to use the built-in Administrator account all the time.

    Update: the problem is caused by a cracked version of Adobe Acrobat, and the root cause is the cracked amtlib.dll. Read more here: http://www.sabernova.com/2013/12/cracked-adobe-acrobat-xi-will-revert.html#axzz2r8VSzZi9

  • Mac server default file permissions

    - by Bobby Jack
    How do I change the default file permissions for files created on a Mac server? In case it's relevant, this is a Mac Mini running Mac OS X 10.6.7. It's currently used mainly as a file server, and there are several users who need to share files; these files need to be writable by all, rather than the default, which is writable only by the owner. I've been trying to do something with umask and a startup script, but I'm not sure there's a startup script that will apply to connections made via the Finder. I also need this to apply to files created on a client (also Macs) and copied onto the server.
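
    A sketch using inherited ACLs on the share point, which make new files group-writable regardless of each client's umask (the group name and path are assumptions for illustration):

        # Run once on the server; the two *_inherit flags make the entry
        # propagate to everything created inside the folder afterwards.
        sudo chmod -R +a "group:sharegroup allow read,write,delete,add_file,add_subdirectory,file_inherit,directory_inherit" /Shared/Files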

  • runit - unable to open supervise/ok: file does not exist

    - by Alexandr Kurilin
    I'm trying to figure out why runit will not boot or give me the status of the managed applications. Running on Ubuntu 12.04. I created /service and /etc/sv/myapp (with a run script, a config file, and a log folder with its own run script), then created a symlink from /service to /etc/sv/myapp. When I run:

        sudo sv s /service/*

    I get the following error message:

        warning: /service/myapp: unable to open supervise/ok: file does not exist

    Some of my googling revealed that rebooting the svscan service might fix this, but killing it and running svscanboot didn't make a difference. Any suggestions? Am I missing a step here somewhere?
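
    A sketch of the checks worth starting with: the supervise/ directory (and the ok FIFO inside it) is created by runsv, so this warning usually means no runsvdir process is scanning /service at all -- and on Ubuntu the packaged runsvdir typically watches /etc/service rather than /service:

        ps aux | grep '[r]unsvdir'      # is it running, and which directory does it scan?

        # If it scans /etc/service, link the service definition there instead:
        sudo ln -s /etc/sv/myapp /etc/service/
        sudo sv status myapp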

  • gparted installed on OpenSuse shows all file system types as greyed out except for hfs

    - by cmdematos.com
    I have had this problem before and fixed it, but I don't recall how I did it and I did not record it (sadness :( ). I have all the requisite commands installed on openSUSE to support gparted's efforts in creating any of the supported file systems. I recall that the problem was that gparted could not find the commands. In any event, all the file system types are greyed out in the context menu except for the legacy hfs partition, which only supports < 2 GB. Even ext2 through ext4 are greyed out. How do I fix this?
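
    A sketch of the usual fix: GParted greys out any filesystem whose userspace tools it cannot find on root's PATH at startup, so reinstall them and launch gparted from a root shell whose PATH includes /sbin and /usr/sbin (the package names are openSUSE-flavoured assumptions):

        sudo zypper install e2fsprogs xfsprogs dosfstools ntfsprogs
        su -          # "su -" gives root's full login PATH, unlike plain "su"
        gparted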

  • Conky truncates text loaded from file

    - by takeshin
    I'm trying to configure conky on Ubuntu, because I need to display my todo list on the desktop. The file is displayed, but the text is truncated -- not at a rectangular boundary, just after some character limit. How can I display the whole file? Here is my setup:

        # Text alignment, other possible values are commented
        alignment top_right

        # Gap between borders of screen and text
        gap_x 10
        gap_y 10

        # Maximum size of buffer for user text, i.e. below TEXT line.
        max_user_text 16384

        # stuff after 'TEXT' will be formatted on screen
        TEXT
        ${execi 30 cat /home/user/Documents/todo.txt}
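
    A sketch of the usual fix: output from objects such as ${execi} is clipped to conky's per-object text buffer, which is governed by text_buffer_size (256 bytes by default) rather than max_user_text, so raising it should stop the truncation:

        # Bytes per text buffer, used for ${exec}/${execi} output among others.
        text_buffer_size 8192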

  • nginx tmp file folder running out of disk space

    - by user1179459
    I get a MySQL disk-space error:

        Can't create/write to file '/tmp/#sql_777_0.MYI' (Errcode: 28)

    mainly because my nginx server writes files into the tmp folder and they never get cleaned up. I added this command to the crontab, as per the instructions in the nginx manual, but it doesn't seem to do the trick (and I don't understand what it does):

        0 */1 * * * /usr/sbin/tmpwatch -am 1 /tmp/nginx_client

    Then I had to run these commands manually:

        cd /tmp/nginx_client
        find -name * | xargs rm

    I need to know what I should do to automate this clean-up. Is there a way to increase the size of /tmp (or /var/tmp) without reformatting or doing anything dangerous? Can I change the location of the MySQL TMP files?
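
    On the last question: yes -- a sketch, assuming a standard MySQL layout, of moving MySQL's scratch area to a larger partition via the tmpdir setting (the path is illustrative; it must exist and be writable by the mysql user, and mysqld must be restarted afterwards):

        # /etc/my.cnf (or /etc/mysql/my.cnf)
        [mysqld]
        tmpdir = /home/mysqltmp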
