Search Results

Search found 39051 results on 1563 pages for 'temporary files'.

  • Reading log files from web application

    - by Egorinsk
    I want to write a small PHP application for monitoring logs on a Debian server, including syslog and Apache/PHP messages. The problem is that the Apache user (www-data) has no access to the /var/log directory. What would be the best way to grant the PHP application access to the logs? Assume the log files can be really large, hundreds of megabytes. I have some ideas:
      - Write a shell script that is run via sudo and tails the last 512 KB of each log into a separate file the application can read. That is inefficient, because it forks a new process and the data is read twice.
      - Add www-data to the adm group (which can read the logs). That is insecure.
      - Start a PHP process via cron every minute to read the logs. That does not allow real-time monitoring, and the script runs and consumes CPU time even when nobody is reading logs; the server is in the cloud, so I pay for it.
      - Create hardlinks to all log files with lowered permissions. I guess that won't work, because logrotate recreates the log files and the inode numbers change.
      - Start a separate nginx/Apache instance under a privileged user that may read the logs.
    Does anyone have a better solution?
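    As a rough illustration of the first idea, tailing the last 512 KB only needs a seek rather than a full read. Below is a minimal Python sketch of such a sudo-run helper; the 512 KB figure comes from the question, while the script name, paths and permissions are assumptions.

      #!/usr/bin/env python3
      # Hypothetical sudo helper: copy the last 512 KB of a log into a file the
      # web application user can read, e.g.: sudo tail_log.py /var/log/syslog /tmp/syslog.tail
      import os
      import sys

      CHUNK = 512 * 1024  # bytes to keep, per the question

      def tail_bytes(path, n=CHUNK):
          # Read only the last n bytes by seeking, instead of scanning the whole file.
          with open(path, "rb") as f:
              f.seek(0, os.SEEK_END)
              size = f.tell()
              f.seek(max(0, size - n))
              return f.read()

      if __name__ == "__main__":
          src, dst = sys.argv[1], sys.argv[2]
          with open(dst, "wb") as out:
              out.write(tail_bytes(src))
          os.chmod(dst, 0o644)  # world-readable; adjust to taste

    A sudoers entry restricted to this one script would keep the web application itself away from /var/log entirely.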

    Read the article

  • Moving hidden files/folders with the command-line or batch-file

    - by Synetech
    Question: Does anyone know of a way to move files and folders that have the hidden, system, or read-only attribute set, from the command line or a batch file? (No, stripping the attributes first is not an option, since there is no practical way to know which attributes were set in order to re-set them after the move.)
    Failed attempts:
      - The basic move command does not work on items with the hidden or system attribute set, and for some reason it has no switches to specify attributes the way dir and del do.
      - A utility I wrote that uses the shell's file-operation function requires start /w to keep the batch file from running on ahead, and it complains about long-filename support for some reason.
      - robocopy works, but it first copies the files and then deletes the originals instead of simply moving the source, which causes a frustrating delay even with the excessive output redirected to nul.
    (Surprisingly, it seems few people have ever needed to move hidden files from the command line. All I could find was one person who abandoned the attempt.)
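    For what it's worth, a scripting language that simply ignores attributes sidesteps the problem. The sketch below is a hedged Python alternative (an assumption, since the question asks for plain batch): on the same volume a move is a rename, so hidden/system/read-only attributes travel with the file, while across volumes the attributes may not survive the copy step.

      #!/usr/bin/env python3
      # Hypothetical sketch: move everything in a folder, hidden and system items included.
      import shutil
      import sys
      from pathlib import Path

      def move_all(src_dir, dst_dir):
          dst = Path(dst_dir)
          dst.mkdir(parents=True, exist_ok=True)
          for item in Path(src_dir).iterdir():              # iterdir() does not skip hidden items
              shutil.move(str(item), str(dst / item.name))  # same-volume moves are renames

      if __name__ == "__main__":
          move_all(sys.argv[1], sys.argv[2])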

    Read the article

  • BYOD (accessing files) on a domain without joining?

    - by Philip White
    I run a Samba 4 instance at a small private school; it makes a regular Linux server appear as a domain controller. There are two relevant benefits to this:
      - I have a Samba share for people's documents, and I use the Redirected Folders feature so that any employee can sit down at any PC, log in with their domain credentials, and have My Documents point to network storage.
      - Everyone has a mapped drive (via Group Policy Preferences) to a share specific to their account type: students can access one share (one share for all students), teachers have another, and office staff have another.
    However, I would like to allow BYOD (Bring Your Own Device). Some employees are already asking for it with their personal laptops, and I know eventually almost everyone will want it. Is there any way to replicate the two features above without having to join PCs to the domain? Joining personal PCs is impractical, if only because only professional editions of Windows support it. Ideally any operating system (including mobile) could access the relevant shares, but of course Windows is key. Offline caching is optional. (I could set up OpenVPN for teachers who want to access their files from home.) The problem with simply giving SSH access to the relevant shares is primarily that Samba 4 relies on ext4 ACLs and extended attributes to maintain NTFS permissions; writing files directly to the Linux server would bypass this and would probably not be interoperable with Samba 4. Right now I am completely flexible; I am even fine with scrapping the whole domain and using some other software for the two features above. How can I give school employees and students the freedom to securely share files without requiring everyone to have specific editions of Windows?

    Read the article

  • Reduce number of config files to as few as possible

    - by Scott
    For most of my applications I use iBatis.Net for database access/modeling and log4Net for logging. In doing this, I need a number of *.config files for each project. For example, a simple application needs the following:
      - app.config ([AssemblyName].[Extention].config)
      - [AssemblyName].SqlMap.config
      - [AssemblyName].log4Net.config
      - [AssemblyName].SqlMapProperties.config
      - providers.config
    When these applications move from DEV to TEST to PRODUCTION, the settings in these files change for each environment. When the number of files is compounded by having 5-10 (or more) supporting executables per project, the workload on the infrastructure team (the ones doing the roll-outs to the different environments) gets rather high, and there is a real risk of a config file being missed or mistyped. What is the best way to avoid these risks? Should I combine all of the config files into one file? (Is that even possible with iBatis?) I know that Visual Studio 2010 introduces transforms for these config files, which let the developer set up all the settings for the different environments and then have the config files updated to the correct versions depending on which build is kicked off (VS 2010 transforms). Thank you for any help that you can provide.

    Read the article

  • (database design): Which tables should be created for all kinds of files (images, attached email files, text files, etc.)?

    - by meyosef
    Hi, I am new to database design and have a question, along with a few solutions of my own. What do you think? Which tables should be created for all kinds of files (images, attached email files, text files storing email bodies, etc.) stored in my online store?
    Option 1: use a separate table for file types
      files { id, files_types_id (FK), file_path, file_extension }
      files_types { id, type_name (unique) }
    Option 2: use a boolean field for each file type
      files { id, file_path, file_extension, is_image_main, is_image_icon, is_image_logo, is_pdf_file, is_text_file }
    Option 3: use one enum field 'file_type'
      files { id, file_path, file_extension, file_type enum(image_main, image_icon, image_logo, pdf, text) }
    Thank you, Yosef

    Read the article

  • Uploading multiple files using Spring MVC 3.0.2 after HiddenHttpMethodFilter has been enabled

    - by Tiny
    I'm using Spring version 3.0.2. I need to upload multiple files using the multiple="multiple" attribute of a single file input, such as

      <input type="file" id="myFile" name="myFile" multiple="multiple"/>

    (and not using multiple file inputs, like the approach in another answer; that indeed works, I tried it). No version of Internet Explorer supports this without an appropriate jQuery plugin/widget, but I don't care about that right now, since most other browsers support it. This works fine with commons-fileupload, but in addition to RequestMethod.POST and RequestMethod.GET I also want to use the other request methods supported and suggested by Spring, like RequestMethod.PUT and RequestMethod.DELETE, in their appropriate places. For that I have configured Spring with HiddenHttpMethodFilter, which works fine, but then only one file is uploaded at a time even though multiple files are chosen in the file browser. In the Spring controller class, the method is mapped as follows:

      @RequestMapping(method = {RequestMethod.POST}, value = {"admin_side/Temp"})
      public String onSubmit(@RequestParam("myFile") List<MultipartFile> files,
                             @ModelAttribute("tempBean") TempBean tempBean,
                             BindingResult error, Map model,
                             HttpServletRequest request, HttpServletResponse response)
              throws IOException, FileUploadException {
          for (MultipartFile file : files) {
              System.out.println(file.getOriginalFilename());
          }
      }

    Even with the request parameter @RequestParam("myFile") List<MultipartFile> files, which is a List of MultipartFile, it only ever holds one file at a time. I found a strategy that is likely to work with multiple files on a blog and have gone through it carefully. The solution below the section "SOLUTION 2 – USE THE RAW REQUEST" says:

      If however the client insists on using the same form input name such as 'files[]' or 'files' and then populating that name with multiple files then a small hack is necessary as follows. As noted above Spring 2.5 throws an exception if it detects the same form input name of type file more than once. CommonsFileUploadSupport – the class which throws that exception is not final and the method which throws that exception is protected so using the wonders of inheritance and subclassing one can simply fix/modify the logic a little bit as follows. The change I've made is literally one word representing one method invocation which enables us to have multiple files incoming under the same form input name.

    It attempts to override the method protected MultipartParsingResult parseFileItems(List fileItems, String encoding) of the abstract class CommonsFileUploadSupport by extending the class CommonsMultipartResolver, such as:

      package multipartResolver;

      import java.io.UnsupportedEncodingException;
      import java.util.HashMap;
      import java.util.Iterator;
      import java.util.List;
      import java.util.Map;
      import javax.servlet.ServletContext;
      import org.apache.commons.fileupload.FileItem;
      import org.springframework.util.StringUtils;
      import org.springframework.web.multipart.MultipartException;
      import org.springframework.web.multipart.MultipartFile;
      import org.springframework.web.multipart.commons.CommonsMultipartFile;
      import org.springframework.web.multipart.commons.CommonsMultipartResolver;

      final public class MultiCommonsMultipartResolver extends CommonsMultipartResolver {

          public MultiCommonsMultipartResolver() {
          }

          public MultiCommonsMultipartResolver(ServletContext servletContext) {
              super(servletContext);
          }

          @Override
          @SuppressWarnings("unchecked")
          protected MultipartParsingResult parseFileItems(List fileItems, String encoding) {
              Map<String, MultipartFile> multipartFiles = new HashMap<String, MultipartFile>();
              Map multipartParameters = new HashMap();
              // Extract multipart files and multipart parameters.
              for (Iterator it = fileItems.iterator(); it.hasNext();) {
                  FileItem fileItem = (FileItem) it.next();
                  if (fileItem.isFormField()) {
                      String value = null;
                      if (encoding != null) {
                          try {
                              value = fileItem.getString(encoding);
                          } catch (UnsupportedEncodingException ex) {
                              if (logger.isWarnEnabled()) {
                                  logger.warn("Could not decode multipart item '" + fileItem.getFieldName() +
                                          "' with encoding '" + encoding + "': using platform default");
                              }
                              value = fileItem.getString();
                          }
                      } else {
                          value = fileItem.getString();
                      }
                      String[] curParam = (String[]) multipartParameters.get(fileItem.getFieldName());
                      if (curParam == null) {
                          // simple form field
                          multipartParameters.put(fileItem.getFieldName(), new String[] { value });
                      } else {
                          // array of simple form fields
                          String[] newParam = StringUtils.addStringToArray(curParam, value);
                          multipartParameters.put(fileItem.getFieldName(), newParam);
                      }
                  } else {
                      // multipart file field
                      CommonsMultipartFile file = new CommonsMultipartFile(fileItem);
                      if (multipartFiles.put(fileItem.getName(), file) != null) {
                          throw new MultipartException("Multiple files for field name [" + file.getName() +
                                  "] found - not supported by MultipartResolver");
                      }
                      if (logger.isDebugEnabled()) {
                          logger.debug("Found multipart file [" + file.getName() + "] of size " + file.getSize() +
                                  " bytes with original filename [" + file.getOriginalFilename() + "], stored " +
                                  file.getStorageDescription());
                      }
                  }
              }
              return new MultipartParsingResult(multipartFiles, multipartParameters);
          }
      }

    The last line of parseFileItems() (the return statement), i.e.

      return new MultipartParsingResult(multipartFiles, multipartParameters);

    causes a compile-time error, because the first parameter multipartFiles is a Map (implemented by HashMap) while the constructor actually requires a MultiValueMap<String, MultipartFile>. It is the constructor of a static class inside the abstract class CommonsFileUploadSupport:

      public abstract class CommonsFileUploadSupport {
          protected static class MultipartParsingResult {
              public MultipartParsingResult(MultiValueMap<String, MultipartFile> mpFiles,
                                            Map<String, String[]> mpParams) {
              }
          }
      }

    The reason might be that this solution targets Spring 2.5, while I'm using Spring 3.0.2, so it may not fit this version. I nevertheless tried to replace the Map with a MultiValueMap in various ways, such as:

      MultiValueMap<String, MultipartFile> mul = new LinkedMultiValueMap<String, MultipartFile>();
      for (Entry<String, MultipartFile> entry : multipartFiles.entrySet()) {
          mul.add(entry.getKey(), entry.getValue());
      }
      return new MultipartParsingResult(mul, multipartParameters);

    but with no success; I'm not sure how to replace Map with MultiValueMap, or whether doing so would even work. After this change, the browser shows the HTTP response:

      HTTP Status 400 -
      type: Status report
      message:
      description: The request sent by the client was syntactically incorrect ().
      Apache Tomcat/6.0.26

    I have tried to keep the question as short as I could and have not included unnecessary code. How can multiple files be uploaded after Spring has been configured with HiddenHttpMethodFilter? That blog indicates it is a long-standing, high-priority bug. If there is no solution for version 3.0.2 (3 or higher), then I will have to drop the Spring multipart support and continue to use commons-fileupload, as suggested by the third solution on that blog, giving up PUT, DELETE and the other request methods. Just curiously waiting for a solution and/or suggestion. Very small changes to the parseFileItems() method in MultiCommonsMultipartResolver might make multiple uploads work, but I couldn't succeed in my attempts (again with Spring 3.0.2 / 3 or higher).

    Read the article

  • Copy all files and folders excluding subversion files and folders on OS X

    - by Michael Prescott
    I'm trying to copy all files and folders from one directory to another while excluding certain files; specifically, I want to exclude Subversion files and folders. However, I'd like a general yet concise solution, since I imagine I'll need to exclude several types of files in the near future. For example, I might want to exclude .svn, *.bak, and *.prj. Here is what I've put together so far, but it is not working for me. The first part, find, works, but I'm doing something wrong with xargs and cp (I tried cp with and without -R). Also, I'm on OS X, which appears to have a less featured version of xargs than Linux systems do.

      find ./sourcedirectory -not \( -name .svn -a -prune \) | xargs -IFILES cp -R FILES ./destinationdirectory
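    As a hedged aside (not the asker's find/xargs pipeline, and assuming Python 3.8+ is acceptable), the same exclusion idea can be expressed with shutil.copytree and an ignore pattern; the directory names below are the ones from the question.

      #!/usr/bin/env python3
      # Sketch: copy a directory tree while skipping .svn directories and *.bak/*.prj files.
      import shutil

      exclude = shutil.ignore_patterns(".svn", "*.bak", "*.prj")  # consulted once per directory

      shutil.copytree("./sourcedirectory", "./destinationdirectory",
                      ignore=exclude, dirs_exist_ok=True)         # dirs_exist_ok needs Python 3.8+

    Adding a new exclusion later is then just one more pattern in the ignore_patterns() call.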

    Read the article

  • nginx static files caching doesn't work

    - by user74344
    Here is my conf file (usr/local/nginx/sites-available/default):

      server {
          listen       80;
          server_name  localhost;

          location / {
              root   html;
              index  index.php index.html index.htm;
          }

          # redirect server error pages to the static page /50x.html
          error_page   500 502 503 504  /50x.html;
          location = /50x.html {
              root   html;
          }

          # serve static files directly
          location ~* ^.+.(jpg|jpeg|gif|css|png|js|ico|swf)$ {
              expires 30d;
          }
      }

    but it doesn't cache static files. How should I fix it? Thanks a lot.

    Read the article

  • Changing Network Path of Offline Files

    - by Adam
    Many of our users have their Home folder set as Available Offline. Their Windows 7 laptops will not be back on our network for a few weeks. In the meantime, we're setting up new servers and reorganizing our files, so the network path to the Home folder is going to be completely different. Based on some testing I did, when the users return, any files they've created or modified while offline will be gone, and the new Home folder will be there and not set to sync. The offline cache of the old Home folder is still accessible through the Sync Center, but they're not going to want to dig through that and try to find what's missing. Avoiding this would involve keeping the old server around and moving everyone to the new location in person, so we know for sure they're synced first. Is there any way to avoid this that isn't as tedious, like a quick registry edit or something that will point the old offline cache to the new location?

    Read the article

  • How to copy files pointed to (not the shortcut files themselves)

    - by Ivo Bosticky
    I have a folder containing shortcuts that point to files located in various directories and drives. I would like to copy the files pointed to (NOT the shortcut files themselves) to a single destination folder. Is there a way in Windows (XP, Vista, 7), a file manager, or some utility I can use to do this? I've heard you can do this with various multi-step custom scripts. However, I've heard rumors there is a one-click way to do it without having to fabricate a custom script each time: regardless of where the shortcuts point, I can select the group of shortcuts and do a copy operation that grabs the files they point to, then paste or otherwise put the actual files (not shortcuts) into one directory. It would be very time-consuming to manually find each file pointed to by a shortcut and copy them to the target folder one by one. Note that I've seen this question asked before on the internet but haven't seen a good answer.
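    One scripted possibility, offered as a sketch rather than the one-click answer the question hopes for: with Python and the pywin32 package (both assumptions), the WScript.Shell COM object can resolve each .lnk to its target so the real files can be copied. The folder paths below are placeholders.

      #!/usr/bin/env python3
      # Sketch: copy the targets of .lnk shortcuts into one folder (requires pywin32).
      import shutil
      from pathlib import Path
      import win32com.client  # pip install pywin32

      def copy_shortcut_targets(shortcut_dir, dest_dir):
          shell = win32com.client.Dispatch("WScript.Shell")
          dest = Path(dest_dir)
          dest.mkdir(parents=True, exist_ok=True)
          for lnk in Path(shortcut_dir).glob("*.lnk"):
              target = shell.CreateShortcut(str(lnk)).TargetPath
              if target and Path(target).is_file():
                  shutil.copy2(target, dest / Path(target).name)  # copy the real file, not the .lnk

      if __name__ == "__main__":
          copy_shortcut_targets(r"C:\Shortcuts", r"C:\Collected")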

    Read the article

  • Offline Files (CSC) on Windows RT

    - by Aeyoun
    Windows RT does not have the Offline Files service, and the Sync Center is also gone. Can it be enabled somehow, or can anyone recommend a replacement? My options are very limited on Windows 8.1 RT. The only thing Microsoft seems to offer is something called Work Folders, which is only supported in Windows 8.1. I really want a more generic solution so that I can also access the files from OS X and Linux (like a Samba share).

    Read the article

  • PNG files are not being displayed in IE and Vista's Sidebar

    - by Wbdvlpr
    Hi, I am using Windows Vista. I was just visiting a website and found some "missing image" boxes on its pages, and I could see the same on many of the websites I visited. Then I realised that only a certain type of image file is not being displayed: PNG. I restarted my computer and noticed that two of the sidebar gadgets were missing background images. The websites with "missing" images work fine in Firefox, though, so it's a problem related to IE and some Windows files. Any ideas how I can get PNGs working in IE and the Sidebar? Thanks.

    Read the article

  • How can I use Apache log files to recreate a usage scenario

    - by daigorocub
    Recently I installed a website that received too many requests and became too slow. Many improvements have been made to the site code, and we've also bought a new server. I want to test the new server with exactly the same requests that made the old server slow; after that, I will double the requests, test again, and so on. These requests are logged in the Apache log files, so I can parse those files and write some kind of script to replay the same requests. Of course, the requests would then come only from my computer against the server, but hey, better than nothing. Questions:
      - Is there an app that does this already?
      - Would you use wget? ab? A Python script?
    Thanks!
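    For illustration, replaying the GET/HEAD lines of a combined-format access log takes only a few lines of Python. This sketch assumes the requests package is installed, replays sequentially (no concurrency, so it will not reproduce load spikes), and the log file name and target host are placeholders.

      #!/usr/bin/env python3
      # Sketch: re-issue the GET/HEAD requests recorded in an Apache access log.
      import re
      import requests  # assumed to be installed (pip install requests)

      TARGET = "http://new-server.example.com"    # placeholder for the new server
      REQUEST = re.compile(r'"(GET|HEAD) (\S+) HTTP/[\d.]+"')

      with open("access.log") as log:             # placeholder log file name
          for line in log:
              m = REQUEST.search(line)
              if m:
                  method, path = m.groups()
                  try:
                      requests.request(method, TARGET + path, timeout=10)
                  except requests.RequestException:
                      pass                        # keep replaying even if a request fails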

    Read the article

  • Unable to see hidden files

    - by TheGoodUser-Sp
    I have BitDefender (with the latest update) on Windows 7. I want to see hidden files, so in Tools > Folder Options > View I change the settings and click OK, but I still can't see hidden files. When I double-check the options, I see the settings have been changed back automatically. I know this is virus-like behaviour! Checking with VirusTotal via Process Explorer didn't help. How can I work out which process is responsible for this?

    Read the article

  • Unable to delete files in Temporary Internet Files folder

    - by Johnny
    I'm on Windows 7. I have a large number of large .bin files, totaling 183 GB, in my Temporary Internet Files folder. They all seem to come from video sharing sites like YouTube. The files are invisible in Explorer even after enabling the viewing of hidden files. The only way I can see them is by issuing "dir /fs" on the command line. When I try to delete them from the command line, nothing happens. Trying to delete the whole folder from Explorer results in "access denied" because another process is using a file in the folder (IE is not running while I'm doing this). Trying to clear the folder using IE is also unsuccessful. How do I delete these files? And how did they end up there without being deleted by IE?

    Read the article

  • Storing source files outside project file directory in Visual Studio C++ 2009

    - by Skurmedel
    Visual Studio projects assume that all files belonging to the project are located in the same directory as the project file, or in one underneath it. For a particular project (in the non-Visual Studio sense) this is not what I want. I want to store the MSVC-specific files in another folder, because there may be other ways to build the application as well, for example with SCons, and all the stuff MSVC spits out clutters the source directory. Example:
      /source
      /scons
      /msvc   <- here is where I want my MSVC-specific stuff
    I can add the files to the source directory manually in Explorer and then link them to the project in Visual Studio. It's not the end of the world, but it annoys me a bit that Visual Studio tries to dictate the folder structure of my project. I looked through the schemas for the project files but realized that this annoying assumption lives in the IDE, not in the project file format. Does anyone know a neater way to solve this than manually linking files into the project from the source directory?

    Read the article

  • Is a MySQL temporary table unique for each user accessing the script that creates it?

    - by SpikETidE
    Hi everyone. While looking for a way to temporarily save the search results when a user searches for a hotel that is free between particular dates, I came across temporary tables. But certain questions are not answered even in the MySQL manual, such as:
      - Will the temporary table be unique for each user that executes the script, or will it be overwritten when two different users run the script at the same time?
      - When will the table be destroyed? When the user closes the browser window, or just navigates away from the page in which the script runs?
    Thanks for your clarifications.
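    As background, MySQL scopes a TEMPORARY table to the database connection that created it, not to a browser session: another connection cannot see it, and it is dropped when its connection closes. A small sketch of that behaviour, assuming the PyMySQL driver and a throwaway test database (both assumptions):

      #!/usr/bin/env python3
      # Sketch: a TEMPORARY table is visible only to the connection that created it.
      import pymysql  # assumed driver; any MySQL client library behaves the same way

      args = dict(host="localhost", user="test", password="test", database="test")
      conn_a = pymysql.connect(**args)
      conn_b = pymysql.connect(**args)

      with conn_a.cursor() as cur:
          cur.execute("CREATE TEMPORARY TABLE search_results (hotel_id INT)")
          cur.execute("INSERT INTO search_results VALUES (1), (2)")

      with conn_b.cursor() as cur:
          try:
              cur.execute("SELECT COUNT(*) FROM search_results")
          except pymysql.MySQLError as err:
              print("second connection cannot see it:", err)  # table does not exist here

      conn_a.close()  # the temporary table is dropped automatically at this point
      conn_b.close()

    Note the caveat for PHP: with persistent connections, a pooled connection may be reused, so the table's lifetime is tied to the connection, not to the page view.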

    Read the article

  • postfix: Temporary lookup failure for FQDN

    - by Thufir
    I'm using the FQDN of dur.bounceme.net, which I want to resolve(?) to localhost. That is, I want mail to [email protected] to get delivered to user@localhost. I've tried following the Ubuntu guide on this and seem to be going in circles a bit.

      root@dur:~# postfix stop
      postfix/postfix-script: stopping the Postfix mail system
      root@dur:~# postfix start
      postfix/postfix-script: starting the Postfix mail system
      root@dur:~# telnet dur.bounceme.net 25
      Trying 127.0.1.1...
      telnet: Unable to connect to remote host: Connection refused
      root@dur:~#
      root@dur:~# telnet localhost 25
      Trying 127.0.0.1...
      Connected to localhost.
      Escape character is '^]'.
      220 dur.bounceme.net ESMTP Postfix (Ubuntu)
      ehlo dur
      250-dur.bounceme.net
      250-PIPELINING
      250-SIZE 10240000
      250-VRFY
      250-ETRN
      250-STARTTLS
      250-ENHANCEDSTATUSCODES
      250-8BITMIME
      250 DSN
      mail from:[email protected]
      250 2.1.0 Ok
      rcpt to:[email protected]
      451 4.3.0 <[email protected]>: Temporary lookup failure
      rcpt to:thufir@localhost
      451 4.3.0 <thufir@localhost>: Temporary lookup failure
      quit
      221 2.0.0 Bye
      Connection closed by foreign host.
      root@dur:~#
      root@dur:~# grep telnet /var/log/mail.log
      Aug 28 00:24:45 dur postfix/smtpd[18256]: NOQUEUE: reject: RCPT from localhost[127.0.0.1]: 451 4.3.0 <thufir@localhost>: Temporary lookup failure; from=<[email protected]> to=<thufir@localhost> proto=ESMTP helo=<dur>
      Aug 28 00:24:58 dur postfix/smtpd[18256]: NOQUEUE: reject: RCPT from localhost[127.0.0.1]: 451 4.3.0 <[email protected]>: Temporary lookup failure; from=<[email protected]> to=<[email protected]> proto=ESMTP helo=<dur>
      Aug 28 00:54:55 dur postfix/smtpd[18825]: NOQUEUE: reject: RCPT from localhost[127.0.0.1]: 451 4.3.0 <[email protected]>: Temporary lookup failure; from=<[email protected]> to=<[email protected]> proto=ESMTP helo=<dur>
      Aug 28 00:55:08 dur postfix/smtpd[18825]: NOQUEUE: reject: RCPT from localhost[127.0.0.1]: 451 4.3.0 <thufir@localhost>: Temporary lookup failure; from=<[email protected]> to=<thufir@localhost> proto=ESMTP helo=<dur>
      root@dur:~#
      root@dur:~# postconf -n
      alias_database = hash:/etc/aliases
      alias_maps = hash:/etc/aliases, hash:/var/lib/mailman/data/aliases
      append_dot_mydomain = no
      biff = no
      broken_sasl_auth_clients = yes
      config_directory = /etc/postfix
      default_transport = smtp
      home_mailbox = Maildir/
      inet_interfaces = loopback-only
      mailbox_command = /usr/lib/dovecot/deliver -c /etc/dovecot/conf.d/01-mail-stack-delivery.conf -m "${EXTENSION}"
      mailbox_size_limit = 0
      mailman_destination_recipient_limit = 1
      mydestination = dur, dur.bounceme.net, localhost.bounceme.net, localhost
      myhostname = dur.bounceme.net
      mynetworks = 127.0.0.0/8 [::ffff:127.0.0.0]/104 [::1]/128
      readme_directory = no
      recipient_delimiter = +
      relay_domains = lists.dur.bounceme.net
      relay_transport = relay
      relayhost =
      smtp_tls_session_cache_database = btree:${data_directory}/smtp_scache
      smtp_use_tls = yes
      smtpd_banner = $myhostname ESMTP $mail_name (Ubuntu)
      smtpd_recipient_restrictions = reject_unknown_sender_domain, reject_unknown_recipient_domain, reject_unauth_pipelining, permit_mynetworks, permit_sasl_authenticated, reject_unauth_destination
      smtpd_sasl_auth_enable = yes
      smtpd_sasl_authenticated_header = yes
      smtpd_sasl_local_domain = $myhostname
      smtpd_sasl_path = private/dovecot-auth
      smtpd_sasl_security_options = noanonymous
      smtpd_sasl_type = dovecot
      smtpd_tls_auth_only = yes
      smtpd_tls_cert_file = /etc/ssl/certs/ssl-mail.pem
      smtpd_tls_key_file = /etc/ssl/private/ssl-mail.key
      smtpd_tls_mandatory_ciphers = medium
      smtpd_tls_mandatory_protocols = SSLv3, TLSv1
      smtpd_tls_received_header = yes
      smtpd_tls_session_cache_database = btree:${data_directory}/smtpd_scache
      smtpd_use_tls = yes
      tls_random_source = dev:/dev/urandom
      transport_maps = hash:/etc/postfix/transport
      root@dur:~#

    Read the article

  • Expire Files In A Folder: Delete Files After x Days

    - by Brett G
    I'm looking to make a "drop folder" on a Windows shared drive that is accessible to everyone, and I'd like files to be deleted automagically if they sit in the folder for more than X days. However, all the methods I've found use a file's last-modified date, last-access time, or creation date. I'm trying to make this a folder that a user can drop files into to share with somebody; if someone copies or moves files in here, I'd like the clock to start ticking at that point. But the last-modified and creation dates are not updated unless someone actually modifies the file, and the last-access time is updated too frequently: it seems that just opening a directory in Windows Explorer updates it. Does anyone know a solution to this? I'm thinking that cataloging file hashes on a daily basis and then expiring files whose hashes are older than a certain date might work, but hashing files can be time consuming. Any ideas would be greatly appreciated! Note: I've already looked at quite a lot of answers on here, including File Server Resource Manager, PowerShell scripts, batch scripts, etc. They all still use the last-access, last-modified or creation time, which, as described, does not fit the above needs.
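    A sketch of the cataloging idea, hedged accordingly: instead of hashing (which is slow on large files), a scheduled script can simply record when each file name was first seen and delete anything seen more than X days ago. The share path, catalog file name and 14-day limit below are assumptions, and keying on names means a re-dropped file with the same name does not reset its clock (the hash approach in the question would handle that).

      #!/usr/bin/env python3
      # Sketch: delete files from a drop folder once they have been there more than MAX_AGE_DAYS,
      # measured from when this script first noticed them (run it daily, e.g. as a scheduled task).
      import json
      import time
      from pathlib import Path

      DROP = Path(r"\\server\share\DropFolder")   # placeholder share path
      CATALOG = Path("first_seen.json")           # remembers when each file was first seen
      MAX_AGE_DAYS = 14                           # placeholder expiry window

      seen = json.loads(CATALOG.read_text()) if CATALOG.exists() else {}
      now = time.time()

      for f in DROP.iterdir():
          if f.is_file():
              seen.setdefault(f.name, now)                      # first sighting
              if now - seen[f.name] > MAX_AGE_DAYS * 86400:
                  f.unlink()                                    # expired: delete it

      # drop catalog entries for files that no longer exist
      seen = {name: ts for name, ts in seen.items() if (DROP / name).exists()}
      CATALOG.write_text(json.dumps(seen))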

    Read the article

  • undelete big files - mission impossible?

    - by johnrembo
    Hi, I've accidentally deleted an outlook.pst file (6.7 GB) while there was only 400 MB of free space left on the primary NTFS partition (Windows XP). I've tried several recovery tools to get this file back. "Ontrack Easy Recovery Pro" found 0 .pst files (complete scan mode), while "Recover My Files" in sector scan mode found 5 .pst files, but 4 of them were between 3 and 28 KB and the fifth was 1 GB. I managed to successfully recover the 1 GB .pst file, which turned out to be a year-old copy (the one used after the latest Windows reinstall). Now I'm frustrated and confused: why was a year-old file successfully recovered if there was only 400 MB left on the primary partition, and where has the 6.7 GB file gone? I did some reading, and it seems there's almost no chance of retrieving the file I'm looking for. But wait: none of the recovery tools I've used found a zero-sized .pst file, and even if the file were corrupted due to fragmentation, we could use scanpst.exe to fix some errors and survive with 10 or 100 emails missing. Could you please recommend some more sophisticated recovery tools for this particular task? Appreciate your help, thanks in advance.

    Read the article

  • Disk fragmentation when dealing with many small files

    - by Zorlack
    On a daily basis we generate about 3.4 million small JPEG files, and we also delete about 3.4 million images that are 90 days old. To date, we've dealt with this content by storing the images hierarchically; the hierarchy is something like /Year/Month/Day/Source/, which lets us efficiently delete a day's worth of content across all sources. The files are stored on a Windows 2003 server connected to a 14-disk SATA RAID 6. We've started having significant performance issues when writing to and reading from the disks. This may be due to the performance of the hardware, but I suspect disk fragmentation may be a culprit as well. Some people have recommended storing the data in a database, but I've been hesitant to do that. Another thought was to use some sort of container file, like a VHD or something. Does anyone have any advice for mitigating this kind of fragmentation?
    Additional info: the average file size is 8-14 KB. Format information from fsutil:
      NTFS Volume Serial Number :       0x2ae2ea00e2e9d05d
      Version :                         3.1
      Number Sectors :                  0x00000001e847ffff
      Total Clusters :                  0x000000003d08ffff
      Free Clusters :                   0x000000001c1a4df0
      Total Reserved :                  0x0000000000000000
      Bytes Per Sector :                512
      Bytes Per Cluster :               4096
      Bytes Per FileRecord Segment :    1024
      Clusters Per FileRecord Segment : 0
      Mft Valid Data Length :           0x000000208f020000
      Mft Start Lcn :                   0x00000000000c0000
      Mft2 Start Lcn :                  0x000000001e847fff
      Mft Zone Start :                  0x0000000002163b20
      Mft Zone End :                    0x0000000007ad2000
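    To make the retention scheme concrete, the /Year/Month/Day/Source/ layout means expiring old content is a matter of removing whole day directories rather than touching 3.4 million individual files. A hedged Python sketch of that daily clean-up follows; the root path and the assumption that the directory names are plain numbers are mine, not from the question.

      #!/usr/bin/env python3
      # Sketch: remove day directories older than 90 days from a /Year/Month/Day/Source layout.
      import shutil
      from datetime import date, timedelta
      from pathlib import Path

      ROOT = Path(r"D:\images")   # placeholder root of the hierarchy
      KEEP_DAYS = 90

      cutoff = date.today() - timedelta(days=KEEP_DAYS)

      for year_dir in (p for p in ROOT.iterdir() if p.is_dir()):
          for month_dir in (p for p in year_dir.iterdir() if p.is_dir()):
              for day_dir in (p for p in month_dir.iterdir() if p.is_dir()):
                  try:
                      day = date(int(year_dir.name), int(month_dir.name), int(day_dir.name))
                  except ValueError:
                      continue                 # not a Year/Month/Day directory; skip it
                  if day < cutoff:
                      shutil.rmtree(day_dir)   # drops that day's images for every source at once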

    Read the article

  • Part 7: EBS Modifications and Flagged Files in R12

    - by volker.eckardt(at)oracle.com
    Let me, building on my previous blog post, explain the flagged-files procedure a bit better, illustrated with screenshots. Flagged files are a concept in Oracle E-Business Suite (EBS) Release 12: you flag a standard deployment file, say a Forms file, a package, or a Java class file. When you run the patch analysis, the list of flagged files is checked, and if one of those files would be patched, the analysis report tells you. Note: this functionality is also available in Release 11, where it is implemented via "applcust.txt". You can flag as many files as you want, in whatever relationship they stand to your customizations. In addition to the flag itself you can add a comment; use this comment to point to your customization reference (here XXAR_RPT_066 or XXAP_CUST_030). Consider the following two cases:
      - You have created your own report based on a standard report. In this case you flag the report file itself and the key views it uses. When a patch updates one of these files, you are informed and can initiate a proper review and testing. (Example: the first line, for ARXCTA.rdf.)
      - You have created an extensive personalization, and because it is business critical you want to be informed if the page definition gets updated. In this case you register the PG.xml file as a flagged file. (Example: the second line, for CreateExtBankAcctPG.xml.)
    The menu path to register flagged files is: (R) System Administrator > (M) Oracle Applications Manager > Site Map > Maintenance > Register Flagged Files. Your DBA should then run the patch analysis every time a new patch is going to be applied: (R) System Administrator > (M) Oracle Applications Manager > Patch Wizard > Task "Recommend/Analyze Patches". The screenshot shows the impact summary; for this blog entry the number "2", titled "Flagged Files Changed", is our focus. When you click the "2" you get a screen similar to the first one in this blog, showing exactly which flagged files would be patched if you applied this patch in this environment right now. Note that it also shows that just 20% of all patch files would be applied; this may differ if your environments are on a different patch level, and the customization impact may then differ as well.
    The flagging step can be done directly in Oracle Applications Manager; our developers are responsible for it. To transport such a flag plus comment we use an FNDLOAD script. It is suggested to put the flagged-files data file directly into your CEMLI patch, so that the flagged-files registration is executed at the same time the patch is applied.
    Process steps:
      Developer:
        - Builds the CEMLI
        - Reviews code and identifies the key standard objects referenced
        - Determines the standard object files and flags them
        - Creates the FNDLOAD file and adds it to the CEMLI patch
      DBA:
        - Executes the patch analysis for every new Oracle standard patch in a representative environment
        - Checks and retrieves the flagged files and comments
        - Sends the flagged-file list back to the development team for analysis / retest
      Developer:
        - Analyses / updates / retests the affected CEMLIs
    Prerequisite: the patch analysis has to be executed in an environment where the flagged files have been registered. (If you run the patch analysis in a vanilla or outdated environment compared to your PROD, the analysis will not be very helpful!)
    When to start with flagged files? Start right now; it is an investment in production stability and in meeting your SLA!
    Summary: Flagged files are a very helpful EBS R12 technique when analysing patches. Implement a procedure within your development process to maintain such flags, and let the DBA run the patch analysis in an environment with a patch and customization level similar to your current production.
    Related links: EBS Patching Procedures - Chapter 2-13 - Registered Flagged Files

    Read the article
