Search Results

Search found 40873 results on 1635 pages for 'split files'.


  • GNU Screen: Combining split-regions and full-screen sessions

    - by scrrr
    Let's say I have three sessions: 0, 1 and 2. I'm on session 0 and I press CTRL-A S to split the screen. Then I select session 1 for the bottom split region, while 0 is in the upper one. Can I switch to session 2 and have it display full-screen while 0 and 1 remain split? If I press CTRL-A n to cycle to other sessions in a split screen, it only changes the contents of the current split region. I want some sessions to be full-screen, though. Is that possible?
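
    A rough keystroke sketch of one way this might be handled with screen's layout support (assumes GNU Screen 4.1 or newer; what the question calls "sessions" are windows in screen's terminology, and the layout names here are made up):

        C-a : layout save splitview    # remember the current split showing windows 0 and 1
        C-a : layout new fullview      # create a second, unsplit layout
        C-a : select 2                 # display window 2 full-screen in the new layout
        C-a : layout select splitview  # jump back to the saved 0/1 split

    Switching between the two layouts leaves the split arrangement untouched while still giving window 2 the whole terminal.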

    Read the article

  • Splitting Multiple Files in Windows

    - by Justin Boucher
    We have a 21TB LUN full of images, approximately 600KB each, in multiple subfolders on the disk. We are trying to split the 21TB LUN into 8 smaller LUNs of about 2.6TB apiece in order to process the images more effectively. My question is: how can we determine which folders add up to 2.6TB on the drive? What is the best tool for marking this data so we can copy it to the new, smaller LUNs with robocopy or emcopy without overfilling them? Is there a third-party tool that would be better suited for this task? Thank you in advance for your assistance.
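
    One way to work out which folders add up to roughly 2.6TB is to size each top-level subfolder and group folders into bins. The following is only a rough PowerShell sketch (PowerShell 3 or later assumed; the paths are placeholders, not from the question):

        $Source   = 'D:\Images'   # assumed mount point of the 21TB LUN
        $BinBytes = 2.6TB         # target size per new LUN
        $bins = @(); $current = @(); $used = 0

        foreach ($dir in Get-ChildItem -Path $Source -Directory) {
            $size = (Get-ChildItem -Path $dir.FullName -Recurse -File |
                     Measure-Object -Property Length -Sum).Sum
            if ($used + $size -gt $BinBytes -and $current.Count -gt 0) {
                $bins += ,$current; $current = @(); $used = 0   # start the next 2.6TB group
            }
            $current += $dir.FullName
            $used    += $size
        }
        if ($current.Count -gt 0) { $bins += ,$current }

        # Each $bins[$i] then becomes one robocopy (or emcopy) job, for example:
        #   robocopy "D:\Images\<folder>" "\\newlun1\Images\<folder>" /E /COPYALL /R:1 /W:1

    If a single subfolder alone exceeds 2.6TB, it would have to be split at a deeper level than this sketch goes.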

    Read the article

  • split a textfile after each n matches to a new file using sed or awk

    - by ozz
    I tried to split a file into parts of n matches each. The file is just one line and the separator is '<br>': foo<br>bar<br>.....<br> I just want to split the file into parts where each file holds 100 datasets (text plus <br>); normally 100 datasets, but the last file may hold fewer. I already played around with this ... split-file-in-2-with-sed and this split-one-file-into-multiple-files-based-on-pattern: sed.exe -e "^.*.<br>{0,100}/g" < original.txt > first_half.txt The split does not work and the result is only 1 file instead of many.
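
    A rough sketch of one way to do this with GNU awk, which accepts a multi-character record separator (the output file naming and the chunk size of 100 are just illustrative):

        gawk 'BEGIN { RS = "<br>"; n = 100 }
              $0 !~ /^[[:space:]]*$/ {
                  file = sprintf("part_%04d.txt", int(count++ / n))   # 100 records per part
                  printf "%s<br>", $0 > file
                  if (count % n == 0) close(file)                     # release the file handle
              }' original.txt

    Each output file ends up with 100 text+<br> datasets, except the last one, which holds whatever remains.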

    Read the article

  • Expire Files In A Folder: Delete Files After x Days

    - by Brett G
    I'm looking to make a "drop folder" on a Windows shared drive that is accessible to everyone. I'd like files to be deleted automagically if they sit in the folder for more than X days. However, all the methods I've found to do this use the last-modified date, last-access time, or creation date of a file. I'm trying to make this a folder that a user can drop files into to share with somebody. If someone copies or moves files into it, I'd like the clock to start ticking at that point. However, the last-modified date and creation date of a file will not be updated unless someone actually modifies the file, and the last-access time is updated too frequently; it seems that just opening a directory in Windows Explorer updates it. Does anyone know of a solution to this? I'm thinking that cataloging the hashes of the files on a daily basis and then expiring files based on hashes older than a certain date might be a solution, but taking hashes of files can be time-consuming. Any ideas would be greatly appreciated! Note: I've already looked at quite a lot of answers on here and looked into File Server Resource Manager, PowerShell scripts, batch scripts, etc. They still use the last-access time, last-modified time or creation time, which, as described, do not fit the needs above.
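
    As a sketch of the cataloging idea, but tracking when each path is first seen rather than hashing (which avoids the hashing cost): run something like the following once a day as a scheduled task. PowerShell 3 or later is assumed, and the share path, catalog location and retention period are placeholders.

        $DropFolder = '\\server\share\DropFolder'       # assumed share path
        $Catalog    = 'C:\Scripts\dropfolder-seen.csv'  # where "first seen" times are kept
        $MaxAgeDays = 14

        $seen = @{}
        if (Test-Path $Catalog) {
            foreach ($row in Import-Csv $Catalog) { $seen[$row.Path] = [datetime]$row.FirstSeen }
        }

        $now = Get-Date
        foreach ($file in Get-ChildItem -Path $DropFolder -Recurse -File) {
            if (-not $seen.ContainsKey($file.FullName)) {
                $seen[$file.FullName] = $now                    # first time this path shows up
            } elseif (($now - $seen[$file.FullName]).TotalDays -gt $MaxAgeDays) {
                Remove-Item -LiteralPath $file.FullName         # sat in the folder too long
                $seen.Remove($file.FullName)
            }
        }

        # Keep only entries for files that still exist, then persist the catalog
        $seen.GetEnumerator() |
            Where-Object { Test-Path -LiteralPath $_.Key } |
            ForEach-Object { [pscustomobject]@{ Path = $_.Key; FirstSeen = $_.Value } } |
            Export-Csv -Path $Catalog -NoTypeInformation

    One caveat: since the catalog is keyed on path only, a file deleted and re-dropped under the same name between runs keeps its original first-seen time.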

    Read the article

  • undelete big files - mission impossible?

    - by johnrembo
    Hi, I've accidentally deleted an outlook.pst file (6.7GB) while there was only 400MB of free space left on the primary NTFS partition (WinXP). I've tried several recovery tools to get this file back. "Ontrack Easy Recovery Pro" found 0 PST files (complete scan mode), while "Recover My Files" in sector scan mode found 5 PSTs, but 4 of them were only 3 to 28 KB in size and the 5th was 1GB. I managed to successfully recover the 1GB PST file, which was a 1-year-old copy (the one used after the latest Windows reinstall). Now I'm frustrated and confused. Why was a 1-year-old file successfully recovered if there was only 400MB left on the primary partition? Where has the 6.7GB file gone? I did some reading (i.e. here), and it seems there's almost no chance of retrieving the file I'm looking for, but wait - none of the recovery tools I've used found a zero-sized PST file, and moreover, if the file is corrupted due to fragmentation, we could use scanpst.exe to fix some errors and survive with 10 or 100 emails missing - whatever. Could you please recommend some more sophisticated recovery tools for this particular task? Appreciate your help - thanks in advance.

    Read the article

  • Disk fragmentation when dealing with many small files

    - by Zorlack
    On a daily basis we generate about 3.4 million small JPEG files, and we also delete about 3.4 million 90-day-old images. To date we've dealt with this content by storing the images hierarchically, something like /Year/Month/Day/Source/, which allows us to effectively delete a day's worth of content across all sources. The files are stored on a Windows 2003 server connected to a 14-disk SATA RAID6. We've started having significant performance issues when writing to and reading from the disks. This may be due to the performance of the hardware, but I suspect that disk fragmentation may be a culprit as well. Some people have recommended storing the data in a database, but I've been hesitant to do this. Another thought was to use some sort of container file, like a VHD or something. Does anyone have any advice for mitigating this kind of fragmentation? Additional info: the average file size is 8-14KB. Format information from fsutil:

        NTFS Volume Serial Number :       0x2ae2ea00e2e9d05d
        Version :                         3.1
        Number Sectors :                  0x00000001e847ffff
        Total Clusters :                  0x000000003d08ffff
        Free Clusters :                   0x000000001c1a4df0
        Total Reserved :                  0x0000000000000000
        Bytes Per Sector :                512
        Bytes Per Cluster :               4096
        Bytes Per FileRecord Segment :    1024
        Clusters Per FileRecord Segment : 0
        Mft Valid Data Length :           0x000000208f020000
        Mft Start Lcn :                   0x00000000000c0000
        Mft2 Start Lcn :                  0x000000001e847fff
        Mft Zone Start :                  0x0000000002163b20
        Mft Zone End :                    0x0000000007ad2000

    Read the article

  • Delphi - How do I split a string into an array of strings based on a delimiter?

    - by Ryan
    Hello all, I'm trying to find a Delphi function that will split an input string into an array of strings based on a delimiter. I've found a lot of examples on Google, but they all seem to have their own issues and I haven't been able to get any of them to work. I just need to split a string like "word:doc,txt,docx" into an array based on ':'. The result would be ['word', 'doc,txt,docx']. Does anyone have a function that they know works? Thank you.
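
    As one illustration (not from the thread), a minimal sketch using TStringList's delimiter handling; the function name is made up, and the StrictDelimiter property assumes Delphi 2006 or later:

        uses
          Classes;

        function SplitToList(const Input: string; Delimiter: Char): TStringList;
        begin
          Result := TStringList.Create;     // caller is responsible for freeing the list
          Result.StrictDelimiter := True;   // split only on the delimiter, not on spaces
          Result.Delimiter := Delimiter;
          Result.DelimitedText := Input;
        end;

        // Usage:
        //   Parts := SplitToList('word:doc,txt,docx', ':');
        //   Parts[0] = 'word'    Parts[1] = 'doc,txt,docx'

    Returning a TStringList rather than a dynamic array keeps the sketch short; converting the list to an array of string is a simple loop if one is needed.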

    Read the article

  • Part 7: EBS Modifications and Flagged Files in R12

    - by volker.eckardt(at)oracle.com
    Let me build on my previous blog and explain the flagged-files procedure a bit better, illustrated with screenshots. Flagged files are a concept within Oracle E-Business Suite (EBS) release 12 that lets you flag a standard deployment file, say a Forms file, a package or a Java class file. When you run the patch analysis, the list of flagged files is checked, and if one of these files would be patched, the analysis report tells you so. Note: this functionality is also available in release 11, where it is implemented and known as "applcust.txt". You can flag as many files as you want, in whatever relationship they stand to your customizations. In addition to the flag itself you can add a comment; use this comment to point to your customization reference (here XXAR_RPT_066 or XXAP_CUST_030). Consider the following two cases:

    1. You have created your own report based on a standard report. In this case you flag the report file itself and the key views it uses. When a patch updates one of these files, you are informed and can initiate a proper review and testing. (Example: the first line for ARXCTA.rdf.)
    2. You have created an extensive personalization, and because it is business critical you would like to be informed if the page definition gets updated. In this case you register the PG.xml file as a flagged file. (Example: the second line below for CreateExtBankAcctPG.xml.)

    The menu path to register flagged files is: (R) System Administrator > (M) Oracle Applications Manager > Site Map > Maintenance > Register Flagged Files

    Your DBA should now run the patch analysis every time he is going to apply a new patch: (R) System Administrator > (M) Oracle Applications Manager > Patch Wizard > Task "Recommend/Analyze Patches"

    The screenshot above shows the impact summary. For this blog entry the number "2", titled "Flagged Files Changed", is our focus. When you click the "2" you get a screen similar to the first one in this blog, showing exactly the files that will be patched if you continue and apply this patch in this environment right now. Note: it is also shown that just 20% of all patch files will be applied. This situation may differ if your environments are on a different patch level, and the customization impact may then differ as well.

    The flagging step can be done directly in Oracle Applications Manager; our developers are responsible for it. To transport such a flag plus comment we use an FNDLOAD script. It is suggested to put the flagged-files data file directly into your CEMLI patch, so that the flagged-files registration is executed at the same time the patch is applied.

    Process steps:
    - Developer: builds the CEMLI; reviews the code and identifies the key standard objects referenced; determines the standard object files and flags them; creates the FNDLOAD file and adds it to the CEMLI patch.
    - DBA: executes the patch analysis for every new Oracle standard patch in a representative environment; checks and retrieves the flagged files and comments; sends the flagged-file list back to the development team for analysis / retest.
    - Developer: analyses, updates and retests the affected CEMLIs.

    Prerequisite: the patch analysis has to be executed in an environment where the flagged files have been registered. (If you run the patch analysis in a vanilla or outdated environment compared to your PROD, the analysis will not be very helpful!)

    When to start with flagged files? Start right now: it is an investment in production stability and in fulfilling your SLA!

    Summary: Flagged files are a very helpful EBS R12 technique when analysing patches. Implement a procedure within your development process to maintain such flags, and let the DBA run the patch analysis in an environment with a patch and customization level similar to your current production.

    Related Links: EBS Patching Procedures - Chapter 2-13 - Registered Flagged Files

    Read the article

  • FFMPEG Splitting MP4 with Same Quality

    - by Pragmatic
    I have one large MP4 file that I am attempting to split into smaller files. ffmpeg -i largefile.mp4 -sameq -ss 00:00:00 -t 00:50:00 smallfile.mp4 I thought using -sameq would keep the same quality settings, but I must be misunderstanding what it does. I'm looking to keep the same quality (audio/video) and compression in the split files, but this setting makes them much larger. What flag(s) do I need to set to keep the same quality and attributes in the split files while maintaining the same quality-to-size ratio? For instance, if my original file is about 12 GB at 1920x1080, with a video bitrate of 10617 kbps at 23 frames/sec and 6-channel audio at 317 kbps, I would like each split file to keep those attributes at only a third of the size (if I split it into three pieces).
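
    A rough sketch of the stream-copy approach: -c copy remuxes the existing audio and video streams instead of re-encoding them, so each piece keeps the source's bitrate and quality (-sameq meant "same quantizer", not "same quality", and still triggers a re-encode). The timestamps below are illustrative, cuts land on the nearest keyframes, and older ffmpeg builds spell the option -vcodec copy -acodec copy.

        ffmpeg -ss 00:00:00 -i largefile.mp4 -t 00:50:00 -c copy part1.mp4
        ffmpeg -ss 00:50:00 -i largefile.mp4 -t 00:50:00 -c copy part2.mp4
        ffmpeg -ss 01:40:00 -i largefile.mp4 -t 00:50:00 -c copy part3.mp4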

    Read the article

  • Multi Threading - How to split the tasks

    - by Motig
    If I have a game engine with the basic 'game engine' components, what is the best way to 'split' the tasks with a multi-threaded approach? Assuming I have the standard components of Rendering, Physics, Scripts and Networking, and a quad-core CPU, I see two ways of multi-threading:

    Option A ('Vertical'): Using this approach I can dedicate one core to each component of the engine, e.g. one core for the Rendering task, one for Physics, etc. Advantages:
    - I do not need to worry about thread-safety within each component.
    - I can take advantage of special optimizations provided for single-threaded access (e.g. DirectX offers a flag that can be set to tell it that you will only use single-threading).

    Option B ('Horizontal'): Using this approach, each task may be split up into 1 <= n <= numCores threads and executed in parallel, with the tasks themselves running one after the other. Advantages:
    - Allows for work-sharing, i.e. each thread can take over work still remaining while the others are still processing.
    - I can take advantage of libraries that are designed for multi-threading (i.e. ... DirectX).

    I think, in retrospect, I would pick Option B (sketched below), but I wanted to hear your thoughts on the matter.
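
    Not an answer from the thread, just a rough C++11 illustration of Option B under stated assumptions (a made-up Body struct stands in for the physics state): a single engine task is fanned out across the available hardware threads, and the frame waits for it to finish before moving on to the next task.

        #include <algorithm>
        #include <cstddef>
        #include <thread>
        #include <vector>

        struct Body { float x = 0, y = 0, vx = 0, vy = 0; };   // stand-in physics object

        // Integrate one slice of the body list; each worker owns its own [begin, end) range.
        static void updatePhysics(std::vector<Body>& bodies, std::size_t begin,
                                  std::size_t end, float dt) {
            for (std::size_t i = begin; i < end; ++i) {
                bodies[i].x += bodies[i].vx * dt;
                bodies[i].y += bodies[i].vy * dt;
            }
        }

        // "Horizontal" split: one task (physics) spread over all cores,
        // while the engine's tasks still run one after another per frame.
        void runPhysicsStep(std::vector<Body>& bodies, float dt) {
            const unsigned cores = std::max(1u, std::thread::hardware_concurrency());
            const std::size_t chunk = (bodies.size() + cores - 1) / cores;
            std::vector<std::thread> workers;
            for (unsigned i = 0; i < cores; ++i) {
                const std::size_t begin = i * chunk;
                const std::size_t end = std::min(bodies.size(), begin + chunk);
                if (begin >= end) break;
                workers.emplace_back(updatePhysics, std::ref(bodies), begin, end, dt);
            }
            for (auto& t : workers) t.join();   // finish physics before the next task starts
        }

    A real engine would reuse a thread pool or job system instead of spawning threads each frame, but the slicing idea is the same.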

    Read the article

  • VMWare Server - Writing files to virtual hard drive performance

    - by Ardman
    We have just moved our infrastructure from physical servers to virtual machines. Everything is running great and we are happy with the result of the move. We have identified one problem, and that is reading/writing performance. We have an application that compiles files and writes to disk. This is considerably slower on the new virtual machines compared to the physical machines. Is there a performance bottleneck when writing to a virtual hard drive compared to a physical hard drive?

    Read the article

  • Where can I find WebSphere configuration files?

    - by Nicholas Key
    Hi there, I would like to know where the WebSphere configuration details are saved; specifically, the configuration details that are shown in the Administrative Console (via the web) or in the console using wsadmin. Some examples would be: Java and Process Management (class loader, process definition, process execution) and Container Settings (session management, SIP container settings, web container settings, portlet container settings). Are there XML files that persist these configuration details? Nicholas
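
    For reference, and not taken from the thread: WebSphere Application Server persists this configuration as XML files in the profile's configuration repository. A typical layout (profile, cell, node and server names vary per installation) looks roughly like this:

        <profile_root>/config/cells/<cell>/cell.xml
        <profile_root>/config/cells/<cell>/security.xml
        <profile_root>/config/cells/<cell>/nodes/<node>/servers/<server>/server.xml       (container and process settings)
        <profile_root>/config/cells/<cell>/nodes/<node>/servers/<server>/resources.xml    (data sources and other resources)

    Changes made in the Administrative Console or through wsadmin end up in these files once the configuration is saved.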

    Read the article

  • Apache - Serving static files from different subdomain + machine

    - by rubayeet
    Here's the scenario: a site is running on the domain www.someserver.com, and I'm going to host subdomain.someserver.com on my machine. Let's say all the image files are under the directory 'img'. I don't want to copy all their images to my machine, so what should the Apache directive(s) be that map a request for an image like http://subdomain.someserver.com/img/image.png to http://www.someserver.com/img/image.png?
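
    Two common ways to express this, as a sketch for the subdomain.someserver.com virtual host (the first needs mod_alias, the second mod_proxy and mod_proxy_http; both modules are assumed to be enabled):

        # Option 1: redirect the browser to the main site for anything under /img
        Redirect permanent /img http://www.someserver.com/img

        # Option 2: fetch the images from the main site transparently (reverse proxy)
        ProxyPass        /img http://www.someserver.com/img
        ProxyPassReverse /img http://www.someserver.com/img

    With the redirect the client sees www.someserver.com URLs; with the proxy the images keep appearing to come from the subdomain, at the cost of an extra hop through your machine.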

    Read the article

  • Best way to manually sort random text files?

    - by Jane
    I have about 1000 text files and I need to view each one and move it to a folder if it's the correct one. I can only do basic sorting by length/size, and I can't grep because the text is random. How can I do this besides manually opening and saving each one in gedit? I'm on Ubuntu Linux. Thanks
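
    A rough interactive shell sketch of that workflow (bash assumed; the *.txt glob and the prompt text are placeholders): page through each file, then type a destination folder, or press Enter to leave it where it is.

        for f in *.txt; do
            less "$f"                                               # view the file
            read -p "Move '$f' to which folder (blank to skip)? " dest
            [ -n "$dest" ] && mkdir -p "$dest" && mv "$f" "$dest/"  # file it away
        done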

    Read the article
