Search Results

Search found 58514 results on 2341 pages for 'show all files'.


  • How to delete duplicate restored user files with "(2)" added (Win7)

    - by user332172
    I restored my user files on a Windows 7 system from the Windows 7 backup. I selected the wrong restore option and all files were restored; existing, unchanged files were restored again with " (2)" appended to the name. Example file names: "01 lesson 1" and "01 lesson 1 (2)". I want to delete all files which had " (2)" appended on restore. Is there a way to write a batch file or script to do this?
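    A minimal sketch of such a cleanup, written here in Python rather than as a batch file (the starting folder is a placeholder, and it assumes every " (2)" copy sits in the same folder as its original):

        # Sketch only: delete restore duplicates whose name ends in " (2)".
        # Comment out p.unlink() for a dry run first.
        from pathlib import Path

        root = Path(r"C:\Users\Me\Documents")   # placeholder starting folder

        for p in root.rglob("*"):
            if not p.is_file() or not p.stem.endswith(" (2)"):
                continue
            original = p.with_name(p.stem[:-len(" (2)")] + p.suffix)
            if original.exists():               # only delete when the original is still there
                print("deleting", p)
                p.unlink()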

    Read the article

  • How to Load Oracle Tables From Hadoop Tutorial (Part 5 - Leveraging Parallelism in OSCH)

    - by Bob Hanckel
    Using OSCH: Beyond Hello World

    In the previous post we discussed a "Hello World" example for OSCH, focusing on the mechanics of getting a toy end-to-end example working. In this post we are going to talk about how to make it work for big data loads. We will explain how to optimize an OSCH external table for load, paying particular attention to Oracle's DOP (degree of parallelism), the number of external table location files we use, and the number of HDFS files that make up the payload. We will provide some rules that serve as best practices when using OSCH. The assumption is that you have read the previous post, have some end-to-end OSCH external tables working, and now want to ramp up the size of the loads.

    Using OSCH External Tables for Access and Loading

    OSCH external tables are no different from any other Oracle external tables. They can be used to access HDFS content using Oracle SQL:

        SELECT * FROM my_hdfs_external_table;

    or to use the same SQL access to load a table in Oracle:

        INSERT INTO my_oracle_table SELECT * FROM my_hdfs_external_table;

    To speed up the load time, you will want to control the degree of parallelism (i.e. DOP) and add two SQL hints:

        ALTER SESSION FORCE PARALLEL DML PARALLEL 8;
        ALTER SESSION FORCE PARALLEL QUERY PARALLEL 8;
        INSERT /*+ append pq_distribute(my_oracle_table, none) */ INTO my_oracle_table
            SELECT * FROM my_hdfs_external_table;

    There are various ways of hinting at what level of DOP you want to use. The ALTER SESSION statements above force the issue, assuming you (the user of the session) are allowed to assert the DOP (more on that in the next section). Alternatively you could embed additional parallel hints directly into the INSERT and SELECT clauses respectively: /*+ parallel(my_oracle_table,8) */ and /*+ parallel(my_hdfs_external_table,8) */.

    Note that the "append" hint lets you load a target table by reserving space above a given "high watermark" in storage and uses Direct Path load. In other words, it doesn't try to fill blocks that are already allocated and partially filled; it uses unallocated blocks. It is an optimized way of loading a table without incurring the typical resource overhead associated with run-of-the-mill inserts. The "pq_distribute" hint in this context unifies the INSERT and SELECT operators to make data flow during a load more efficient. Finally, your target Oracle table should be defined with the "NOLOGGING" and "PARALLEL" attributes. The combination of "NOLOGGING" and the "append" hint disables REDO logging and its overhead. The "PARALLEL" clause tells Oracle to try to use parallel execution when operating on the target table.

    Determine Your DOP

    It might feel natural to build your datasets in Hadoop and then afterwards figure out how to tune the OSCH external table definition, but you should start backwards. You should focus on the Oracle database, specifically the DOP you want to use when loading (or accessing) HDFS content using external tables. The DOP in Oracle controls how many PQ slaves are launched in parallel when executing an external table. Typically the DOP is something you want Oracle to control transparently, but for loading content from Hadoop with OSCH, it's something that you will want to control.
    Oracle computes the maximum DOP that can be used by an Oracle user. The maximum value that can be assigned is an integer value typically equal to the number of CPUs on your Oracle instances, times the number of cores per CPU, times the number of Oracle instances. For example, suppose you have a RAC environment with 2 Oracle instances, and suppose that each system has 2 CPUs with 32 cores. The maximum DOP would be 128 (i.e. 2*2*32). In point of fact, if you are running on a production system, the maximum DOP you are allowed to use will be restricted by the Oracle DBA. This is because using the system maximum DOP can subsume all system resources on Oracle and starve anything else that is executing. Obviously on a production system where resources need to be shared 24x7, this can't be allowed to happen.

    The use cases for being able to run OSCH with the maximum DOP are when you have exclusive access to all the resources on an Oracle system. This can be when you are first seeding tables in a new Oracle database, or at a time when normal activity in the production database can be safely taken off-line for a few hours to free up resources for a big incremental load. Using OSCH on high-end machines (specifically Oracle Exadata and Oracle BDA cabled with Infiniband), this mode of operation can load up to 15TB per hour. The bottom line is that you should first figure out what DOP you will be allowed to run with by talking to the DBAs who manage the production system. You then use that number to derive the number of location files and (optionally) the number of HDFS data files that you want to generate, assuming that is flexible.

    Rule 1: Find out the maximum DOP you will be allowed to use with OSCH on the target Oracle system.

    Determining the Number of Location Files

    Let's assume that the DBA told you that your maximum DOP was 8. You want the number of location files in your external table to be big enough to utilize all 8 PQ slaves, and you want them to represent equally balanced workloads. Remember that location files in OSCH are metadata lists of HDFS files and are created using OSCH's External Table tool. They also represent the workload size given to an individual Oracle PQ slave (i.e. a PQ slave is given one location file to process at a time, and only it will process the contents of that location file).

    Rule 2: The size of the workload of a single location file (and the PQ slave that processes it) is the sum of the content size of the HDFS files it lists.

    For example, if a location file lists 5 HDFS files which are each 100GB in size, the workload size for that location file is 500GB. The number of location files that you generate is something you control by providing a number as input to OSCH's External Table tool.

    Rule 3: The number of location files chosen should be a small multiple of the DOP.

    Each location file represents one workload for one PQ slave, so the goal is to keep all slaves busy and try to give them equivalent workloads. Obviously if you run with a DOP of 8 but have 5 location files, only five PQ slaves will have something to do and the other three will have nothing to do and will quietly exit. If you run with 9 location files, then the PQ slaves will pick up the first 8 location files and, assuming they have equal workloads, will finish up at about the same time. But the first PQ slave to finish its job will then be rescheduled to process the ninth location file, potentially doubling the end-to-end processing time. So for this DOP, using 8, 16, or 32 location files would be a good idea.
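    As a back-of-the-envelope aid, the sizing arithmetic above can be sketched like this (illustration only, using the numbers from this post's example; it is not part of the OSCH tooling):

        # Sketch of the sizing arithmetic described above (not an Oracle/OSCH API).
        def max_dop(instances, cpus_per_instance, cores_per_cpu):
            # e.g. a 2-node RAC with 2 CPUs of 32 cores each -> 2 * 2 * 32 = 128
            return instances * cpus_per_instance * cores_per_cpu

        def location_file_choices(allowed_dop, multiples=(1, 2, 4)):
            # Rule 3: use a small multiple of the DOP you are allowed to run with
            return [allowed_dop * m for m in multiples]

        print(max_dop(2, 2, 32))          # 128
        print(location_file_choices(8))   # [8, 16, 32]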
    Determining the Number of HDFS Files

    Let's start with the next rule and then explain it:

    Rule 4: The number of HDFS files should be a multiple of the number of location files, and the files should be relatively the same size.

    In our running example, the DOP is 8. This means that the number of location files should be a small multiple of 8. Remember that each location file represents a list of unique HDFS files to load, and that the sum of the files listed in each location file is a workload for one Oracle PQ slave. The OSCH External Table tool will look in an HDFS directory for a set of HDFS files to load. It will generate N location files (where N is the value you gave to the tool). It will then try to divvy up the HDFS files and do its best to make sure the workload across location files is as balanced as possible. (The tool uses a greedy algorithm that grabs the biggest HDFS file and delegates it to a particular location file. It then looks for the next biggest file and puts it in some other location file, and so on.) The tool's ability to balance is reduced if HDFS file sizes are grossly out of balance or are too few.

    For example, suppose my DOP is 8 and the number of location files is 8. Suppose I have only 8 HDFS files, where one file is 900GB and the others are 100GB. When the tool tries to balance the load it will be forced to put the singleton 900GB file into one location file, and put each of the 100GB files in the 7 remaining location files. The load balance skew is 9 to 1. One PQ slave will be working overtime, while the slacker PQ slaves are off enjoying happy hour. If, however, the total payload (1600 GB) were broken up into smaller HDFS files, the OSCH External Table tool would have an easier time generating a list where the workload for each location file is relatively the same. Applying Rule 4 above to our DOP of 8, we could divide the workload into 160 files that were approximately 10 GB in size. For this scenario the OSCH External Table tool would populate each location file with 20 HDFS file references, and all location files would have similar workloads (approximately 200GB per location file). As a rule, when the OSCH External Table tool has to deal with more and smaller files it will be able to create more balanced loads.

    How small should HDFS files get? Not so small that the HDFS open and close file overhead starts having a substantial impact. For our performance test system (Exadata/BDA with Infiniband), I compared three OSCH loads of 1 TiB. One load had 128 HDFS files living in 64 location files, where each HDFS file was about 8GB. I then did the same load with 12800 files, where each HDFS file was about 80MB in size. The end-to-end load time was virtually the same. However, when I got ridiculously small (i.e. 128000 files at about 8MB per file), it started to make an impact and slow down the load time.

    What happens if you break rules 3 or 4 above? Nothing draconian; everything will still function. You just won't be taking full advantage of the generous DOP that was allocated to you by your friendly DBA. The key point of the rules articulated above is this: if you know that HDFS content is ultimately going to be loaded into Oracle using OSCH, it makes sense to chop it up into the right number of files of roughly the same size, derived from the DOP that you expect to use for loading.
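    A rough sketch of that kind of greedy balancing, for illustration only (this is not the External Table tool's actual code):

        # Illustration: spread HDFS files over location files, biggest file first,
        # always into the currently least-loaded location file.
        def assign_to_location_files(hdfs_files, num_location_files):
            buckets = [{"files": [], "total": 0} for _ in range(num_location_files)]
            for name, size in sorted(hdfs_files, key=lambda f: f[1], reverse=True):
                target = min(buckets, key=lambda b: b["total"])
                target["files"].append(name)
                target["total"] += size
            return buckets

        # The skewed example from the text: one 900GB file plus seven 100GB files
        # over 8 location files leaves one workload 9x bigger than the rest.
        skewed = [("big", 900)] + [("small%d" % i, 100) for i in range(7)]
        print([b["total"] for b in assign_to_location_files(skewed, 8)])   # [900, 100, ..., 100]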
    Next Steps

    So far we have talked about OLH and OSCH as alternative models for loading. That's not quite the whole story. They can be used together in a way that provides for more efficient OSCH loads and allows one to be more flexible about scheduling on a Hadoop cluster and an Oracle Database to perform load operations. The next lesson will talk about Oracle Data Pump files generated by OLH and loaded using OSCH. It will also outline the pros and cons of using various load methods. This will be followed up with a final tutorial lesson focusing on how to optimize OLH and OSCH for use on Oracle's engineered systems: specifically Exadata and the BDA.

    Read the article

  • Personal Technology – Excel Tip: Comparing Excel Files

    - by Pinal Dave
    This guest post is by Vinod Kumar. Vinod Kumar has worked with SQL Server extensively since joining the industry over a decade ago. Working on various versions – from SQL Server 7.0 and Oracle 7.3 to other database technologies – he now works with the Microsoft Technology Center (MTC) as a Technology Architect. Let us read the blog post in Vinod's own voice.

    I have been writing about Excel tips on my blog and thought it would be great to share one interesting tip here as a guest post. Assume a situation where you want to compare multiple Excel files. Here is a typical scenario I have encountered as a common activity: you send an Excel file with tons of data, formulae and multiple sheets, and you ask a colleague to validate the file and, if required, change content for correctness. After receiving the file back from your colleague, you want to know what changes this person made to your document.

    Here is a cool new addition to Excel 2013 that can help you achieve this task. To get to this option, click the INQUIRE tab. In case you don't have the INQUIRE tab, check the "Options using INQUIRE" blog post; in that post, we discuss all the other options of the INQUIRE tab. Once you are on the INQUIRE tab, select the "Compare Files" button as shown in the figure above. This brings up a dialog as below. If you are on Windows 8 or Windows 7, you can also search for an application called "Spreadsheet Compare 2013". Ultimately both options lead us to the same application. If you are using the standalone app, once the app initializes, click the "Compare Files" option from the toolbar. Make sure to give two different Excel files, as shown in the figure above.

    After selecting the Excel sheets, you can see that the Compare tool has a number of other options to play with. We will talk about some of them later in this post. Just below the toolbar is a colorful side-by-side comparison of both our Excel sheets. We can also see the various tabs from each file. Each color code has a meaning, which is discussed next: the bottom pane lists each of the color codes and, most importantly, each of the changes compared side-by-side. The detailed information shown below can be exported using the "Export Results" option from the toolbar as a separate Excel workbook, or can be copied to the clipboard to be used later.

    The final piece of the puzzle is a graphical view of these difference results by category. We cannot drill down per se, but this is a great way to know that most of the changes seem to be based on "Cell Formats", and that a few "Calculated Values" have changed as well. The INQUIRE option and the Spreadsheet Compare 2013 tool are part of Excel 2013. So as you explore the new version of Excel, there are many such hidden features that are worth exploring. Do let us know if you enjoyed learning a new feature today, and I hope you will play around with this feature in your day-to-day challenges when working with Excel files.

    Reference: Pinal Dave (http://blog.sqlauthority.com)
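    For readers without the INQUIRE add-in, a very rough value-only comparison can be scripted. The sketch below uses the third-party openpyxl Python library, compares cell values only (it knows nothing about the formatting and formula categories Spreadsheet Compare reports), and the file names are placeholders:

        # Sketch: report cells whose values differ between two workbooks.
        # (Cells that exist only in the reviewed file's extra rows/columns are not checked.)
        from openpyxl import load_workbook

        before = load_workbook("original.xlsx", data_only=True)   # placeholder names
        after = load_workbook("reviewed.xlsx", data_only=True)

        for sheet_name in before.sheetnames:
            if sheet_name not in after.sheetnames:
                print("sheet missing from reviewed file:", sheet_name)
                continue
            ws_before, ws_after = before[sheet_name], after[sheet_name]
            for row in ws_before.iter_rows():
                for cell in row:
                    other = ws_after[cell.coordinate]
                    if cell.value != other.value:
                        print("%s!%s: %r -> %r" % (sheet_name, cell.coordinate, cell.value, other.value))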

    Read the article

  • Archive Recorded Show from DVR

    - by BillN
    If I have an AT&T U-verse DVR, is there a way to copy the recordings off the DVR to a PC for archiving/burning to a DVD? I know that I could connect the A/V outs on the DVR to the A/V inputs on a DVD recorder, or to a capture card on a PC, and play the show and record it in real time on the DVD or PC, but if the show is, say, a World Cup game, this would take 2-3 hours per game. My thought is that since the DVR is just a hard drive and the files have to be stored in a digital format, it should be possible to do a copy of some sort. I'm currently recording all of the WC games, but if I don't watch them right away, I'm concerned that they will push out other programming that the rest of the family has recorded, making me unpopular at home. Note, I'm talking about archiving for personal use, not redistribution or anything.

    Read the article

  • ICQ offline messages (do not show up)

    - by Nils Riedemann
    Hi there, I have a weird problem with offline messages. I use Adium on 2 Macs (home and office) and Beejive (iPhone) for connecting to ICQ and other protocols. When someone sends me an offline message I expect it to show up the next time I log in with any of my 3 devices. But that's not the case. Recently some show up at the office, some at home and some on my iPhone – I couldn't figure out any logic behind it; it seems rather random. I thought for a while that it depended on the last device I had been logged in with, but that's not the case either. A friend can send me offline messages during the weekend, and I can log in with my Mac at home or my iPhone, yet in some cases the message only shows up when I log in from the office. Weird. It really freaks me out – and no, I can't make 'em go to Jabber or the like. Anyone experiencing a similar problem?

    Read the article

  • How to make VLC show single FLAC file containing many tracks as separate tracks

    - by Bryce Thomas
    I have a single .flac file that encompasses multiple tracks comprising a music album along with the corresponding .cue file. I'm running Ubuntu 10.04 with VLC player, and am trying to get VLC player to show the individual tracks and allow me to use the previous and next controls to move back and forth between tracks. The problem I'm having is that when I open the single .flac file with VLC, it just shows a single track the duration of the entire album, and I'm unable to skip back and forth between tracks. Is there any way to have VLC show the individual tracks contained within a .flac file without having to pre-split the .flac file into individual track files first?

    Read the article

  • XCP vm created via xl create doesn't show up in Xencenter neither in xe vm-list

    - by user138664
    I am running XCP 1.4.90.530.170661 and I have many PV guests running that were created via XenCenter. I have now created an openwrt PV guest via the xl create command, but it doesn't show up in XenCenter or in the output of the xe vm-list command:

        xe vm-list
        uuid ( RO)           : 6d29aac1-67ff-f83e-4dbc-894a3b6b9c10
             name-label ( RW): slitaz
            power-state ( RO): running

        uuid ( RO)           : 07d96dd1-8223-cd1c-587d-ae37e48d267b
             name-label ( RW): xen-centos6
            power-state ( RO): running

        uuid ( RO)           : 3164bcf1-e43c-badb-e0cf-5423751fffb9
             name-label ( RW): xenwin7
            power-state ( RO): running

        uuid ( RO)           : 8a31725e-4bcb-48ac-ba7b-e7c1ba310789
             name-label ( RW): Control domain on host: xen-mini
            power-state ( RO): running

        xl list
        Name          ID   Mem VCPUs      State   Time(s)
        Domain-0       0   300     2     r-----   4862.5
        xenwin7        2   766     1     -b----   4933.1
        slitaz        10   255     1     -b----     30.8
        xen-centos6   11   767     1     -b----     46.9
        openwrt       12    32     1     -b----      6.8

    Are these two different kinds of PV VM? If yes, is there a way to export the "xl create" one as xva/ova and import it so it shows up like the others? Thanks in advance, /c/

    Read the article

  • show/hide specific scripts from browsers (adblock)

    - by user143822
    I'm using AdBlock Plus and Element Hiding Helper to show/hide DOM elements, but I don't understand how I can show/hide a specific script-driven element on a page. For example, look at this page: http://downloadzoneforum.net – it has several divs with the class maintitle, and every maintitle div has a spoiler. If I click the minus icon I can close the container; by default the maintitle spoilers are open. But I want to hide some spoilers using a filter, so that when I open Firefox the spoilers for Extra Forum and Discussioni Varie are hidden, like this: How can I do this using AdBlock, Element Hiding Helper, or another solution?

    Read the article

  • Shortcut for "show in folder" in Windows 7

    - by richardh
    I'm new to Windows (a former Mac user) and have been using Windows 7 for about two months now. I almost exclusively use the taskbar to navigate to files (i.e., I press the Win/meta key and start typing... my libraries and naming conventions make it pretty easy to get the correct file). Then I press Enter and the file opens. Awesome. But sometimes I want to see the file in its folder (e.g., maybe I want to rename, move, or copy it). To do this I need to mouse/trackpad over and right-click to get the "show in folder" option. Is there another way, short of searching for the folder name instead? Is there a hotkey/shortcut for "show in folder"? Thanks!

    Read the article

  • "type" Command Not Working As Expected on Git Bash

    - by trysis
    The type command, in Linux, returns the location on the filesystem of the given file, if it is in the current folder or the $PATH. This functionality is also available in Windows through the Git Bash command-line program. The command also returns a file's location given the file without its extension (.exe, .vbs, etc.). However, I have run into what seems like a strange corner case where the file exists on the $PATH but doesn't get returned by the command.

    I am thinking of buying a new computer soon, so I looked up the method of transferring the license key from one computer to another, in preparation for actually doing this. The method I found mentioned the files slmgr.vbs and slui.exe, both of which reside in the C:\Windows\System32 folder, which is in my $PATH, as usual for a Windows computer. However, these two files aren't showing up when I use the type command. Also, neither gets executed when I call the files as commands without their extensions in Git Bash, and only slmgr.vbs gets executed when I call them with the extensions. Finally, slmgr.vbs is shown when listing the folder's contents in Git Bash as well, but slui.exe isn't. I thought this might have to do with permissions, and indeed both files have very restrictive permissions, as you can see in the pictures below, but they both have the same permissions, which wouldn't explain why one gets executed and the other doesn't when called directly, nor why one file is listed on the command line but the other isn't.

    C:\Windows\System32 folder, proving the files exist:
    File permissions for the Users and Administrators groups for the two files (they are identical):
    And the folder:

    The type command and its output in Git Bash for the 2 files, plus listing the files in the folder (using grep to filter, as the folder is huge), as well as listing part of the $PATH (keep in mind, for all of these, that Git Bash changes the paths as they are displayed):

        Sean@MYPC ~ $ type -a slmgr
        sh.exe": type: slmgr: not found
        Sean@MYPC ~ $ type -a slmgr.vbs
        sh.exe": type: slmgr.vbs: not found
        Sean@MYPC ~ $ type -a slui
        sh.exe": type: slui: not found
        Sean@MYPC ~ $ type -a slui.exe
        sh.exe": type: slui.exe: not found
        Sean@MYPC ~ $ slmgr
        sh.exe": slmgr: command not found
        Sean@MYPC ~ $ slmgr.vbs
        /c/WINDOWS/system32/slmgr.vbs: line 2: syntax error near unexpected token `('
        /c/WINDOWS/system32/slmgr.vbs: line 2: `' Copyright (c) Microsoft Corporation. All rights reserved.'
        Sean@MYPC ~ $ slui
        sh.exe": slui: command not found
        Sean@MYPC ~ $ slui.exe
        sh.exe": slui.exe: command not found
        Sean@MYPC ~ $ ls /c/Windows/System32/slui.exe /c/Windows/System32/slmgr.vbs
        ls: /c/Windows/System32/slui.exe: No such file or directory
        /c/Windows/System32/slmgr.vbs
        Sean@MYPC ~ $ echo $PATH
        /c/Users/Sean/bin:.:/usr/local/bin:/mingw/bin:/bin:/cmd:/c/Python33/:/c/Program Files (x86)/Intel/iCLS Client/:/c/Program Files/Intel/iCLS Client/:/c/WINDOWS/system32:/c/WINDOWS:/c/WINDOWS/System32/Wbem:/c/WINDOWS/System32/WindowsPowerShell/v1.0/:/c/Program Files/Intel/Intel(R) Management Engine Components/DAL:/c/Program Files/Intel/Intel(R) Management Engine Components/IPT:/c/Program Files (x86)/Intel/Intel(R) Management Engine Components/DAL:/c/Program Files (x86)/Intel/Intel(R) Management Engine Components/IPT:/c/Program Files/Intel/WiFi/bin/:/c/Program Files/Common Files/Intel/WirelessCommon/:/c/strawberry/c/bin:/c/strawberry/perl/site/bin:/c/strawberry/perl/bin:/c/Program Files (x86)/Microsoft ASP.NET/ASP.NET Web Pages/v1.0/:/c/Program Files/Microsoft SQL Server/110/Tools/Binn/:/c/Program Files (x86)/Microsoft SQL Server/90/Tools/binn/:/c/Program Files (x86)/OpenAFS/Common:/c/HashiCorp/Vagrant/bin:/c/Program Files (x86)/Windows Kits/8.1/Windows Performance Toolkit/:/c/Program Files/nodejs/:/c/Program Files (x86)/Git/cmd:/c/Program Files (x86)/Git/bin:/c/Program Files/Microsoft/Web Platform Installer/:/c/Ruby200-x64/bin:/c/Users/Sean/AppData/Local/Box/Box Edit/:/c/Program Files (x86)/SSH Communications Security/SSH Secure Shell:/c/Users/Sean/Documents/Lisp:/c/Program Files/GCL-2.6.1/lib/gcl-2.6.1/unixport:/c/Chocolatey/bin:/c/Users/Sean/AppData/Roaming/npm:/c/wamp/bin/mysql/mysql5.6.12/bin:/c/Program Files/Oracle/VirtualBox:/c/Program Files/Java/jdk1.7.0_51/bin:/c/Program Files/Node-Growl:/c/chocolatey/bin:/c/Program Files/eclipse:/c/MongoDB/bin:/c/Program Files/7-Zip:/c/Program Files (x86)/Google/Chrome/Application:/c/Program Files (x86)/LibreOffice 4/program:/c/Program Files (x86)/OpenOffice 4/program

    What's happening? Why aren't these files listed with the type command? Is this because of weird Windows permissions, or something even weirder? If permissions, why do they seem to have the same permissions, yet both are not handled in the same way?

    Read the article

  • OAF Page to Upload Files into Server from local Machine

    - by PRajkumar
    1. Create a New Workspace and Project
       File > New > General > Workspace Configured for Oracle Applications
       File Name -- PrajkumarFileUploadDemo
       A new OA Project will also be created automatically:
       Project Name -- FileUploadDemo
       Default Package -- prajkumar.oracle.apps.fnd.fileuploaddemo

    2. Create a New Application Module (AM)
       Right-click on FileUploadDemo > New > ADF Business Components > Application Module
       Name -- FileUploadAM
       Package -- prajkumar.oracle.apps.fnd.fileuploaddemo.server
       Check Application Module Class: FileUploadAMImpl Generate JavaFile(s)

    3. Create a New Page
       Right-click on FileUploadDemo > New > Web Tier > OA Components > Page
       Name -- FileUploadPG
       Package -- prajkumar.oracle.apps.fnd.fileuploaddemo.webui

    4. Select FileUploadPG and go to the structure pane, where a default region has been created.

    5. Select region1 and set the following properties:
       Attribute        Property
       ID               PageLayoutRN
       AM Definition    prajkumar.oracle.apps.fnd.fileuploaddemo.server.FileUploadAM
       Window Title     Uploading File into Server from Local Machine Demo
       Title            Uploading File into Server from Local Machine Demo

    6. Create a Stack Layout Region under the Page Layout Region
       Right-click PageLayoutRN > New > Region
       Attribute        Property
       ID               MainRN
       Region Style     messageComponentLayout

    7. Create a New Item: messageFileUpload bean under MainRN
       Right-click on MainRN > New > messageFileUpload
       Set the following properties for the new item:
       Attribute        Property
       ID               MessageFileUpload
       Item Style       messageFileUpload

    8. Create a New Item: Submit button bean under MainRN
       Right-click on MainRN > New > messageLayout
       Set the following properties for the messageLayout:
       Attribute        Property
       ID               ButtonLayout
       Right-click on ButtonLayout > New > Item
       Attribute        Property
       ID               Submit
       Item Style       submitButton
       Attribute Set    /oracle/apps/fnd/attributesets/Buttons/Go
    9. Create a Controller for page FileUploadPG
       Right-click on PageLayoutRN > Set New Controller
       Package Name: prajkumar.oracle.apps.fnd.fileuploaddemo.webui
       Class Name: FileUploadCO

       Write the following code in FileUploadCO (processFormRequest):

        import oracle.cabo.ui.data.DataObject;
        import java.io.FileOutputStream;
        import java.io.InputStream;
        import oracle.jbo.domain.BlobDomain;
        import java.io.File;
        import oracle.apps.fnd.framework.OAException;

        public void processFormRequest(OAPageContext pageContext, OAWebBean webBean)
        {
            super.processFormRequest(pageContext, webBean);
            if (pageContext.getParameter("Submit") != null)
            {
                upLoadFile(pageContext, webBean);
            }
        }

       -- Use the following code if you want to upload files to the local machine --

        public void upLoadFile(OAPageContext pageContext, OAWebBean webBean)
        {
            String filePath = "D:\\PRajkumar";
            System.out.println("Default File Path---->" + filePath);
            String fileUrl = null;
            try
            {
                // MessageFileUpload is the ID of the messageFileUpload bean
                DataObject fileUploadData = pageContext.getNamedDataObject("MessageFileUpload");
                if (fileUploadData != null)
                {
                    String uFileName = (String) fileUploadData.selectValue(null, "UPLOAD_FILE_NAME");        // uploaded file name
                    String contentType = (String) fileUploadData.selectValue(null, "UPLOAD_FILE_MIME_TYPE"); // MIME type
                    System.out.println("User File Name---->" + uFileName);
                    FileOutputStream output = null;
                    InputStream input = null;
                    BlobDomain uploadedByteStream = (BlobDomain) fileUploadData.selectValue(null, uFileName);
                    System.out.println("uploadedByteStream---->" + uploadedByteStream);
                    File file = new File("D:\\PRajkumar", uFileName);
                    System.out.println("File output---->" + file);
                    output = new FileOutputStream(file);
                    System.out.println("output----->" + output);
                    input = uploadedByteStream.getInputStream();
                    System.out.println("input---->" + input);
                    byte abyte0[] = new byte[0x19000];
                    int i;
                    while ((i = input.read(abyte0)) > 0)
                        output.write(abyte0, 0, i);
                    output.close();
                    input.close();
                }
            }
            catch (Exception ex)
            {
                throw new OAException(ex.getMessage(), OAException.ERROR);
            }
        }

       -- Use the following code if you want to upload files to the server --

        public void upLoadFile(OAPageContext pageContext, OAWebBean webBean)
        {
            String filePath = "/u01/app/apnac03r12/PRajkumar/";
            System.out.println("Default File Path---->" + filePath);
            String fileUrl = null;
            try
            {
                // MessageFileUpload is the ID of the messageFileUpload bean
                DataObject fileUploadData = pageContext.getNamedDataObject("MessageFileUpload");
                if (fileUploadData != null)
                {
                    String uFileName = (String) fileUploadData.selectValue(null, "UPLOAD_FILE_NAME");        // uploaded file name
                    String contentType = (String) fileUploadData.selectValue(null, "UPLOAD_FILE_MIME_TYPE"); // MIME type
                    System.out.println("User File Name---->" + uFileName);
                    FileOutputStream output = null;
                    InputStream input = null;
                    BlobDomain uploadedByteStream = (BlobDomain) fileUploadData.selectValue(null, uFileName);
                    System.out.println("uploadedByteStream---->" + uploadedByteStream);
                    File file = new File("/u01/app/apnac03r12/PRajkumar", uFileName);
                    System.out.println("File output---->" + file);
                    output = new FileOutputStream(file);
                    System.out.println("output----->" + output);
                    input = uploadedByteStream.getInputStream();
                    System.out.println("input---->" + input);
                    byte abyte0[] = new byte[0x19000];
                    int i;
                    while ((i = input.read(abyte0)) > 0)
                        output.write(abyte0, 0, i);
                    output.close();
                    input.close();
                }
            }
            catch (Exception ex)
            {
                throw new OAException(ex.getMessage(), OAException.ERROR);
            }
        }

    10. Congratulations, you have successfully finished. Run your page and test your work.
        (Screenshots in the original post: the code used to upload files to the server, before and after uploading to the server; the code used to upload files to the local machine, before and after uploading to the local machine.)

    Read the article

  • Removing offline/defunct files in SQL server 2008

    - by philox
    How do I remove traces of files marked as OFFLINE or DEFUNCT in Microsoft SQL Server 2008? I have been playing around with a setup where I create a database with 3 filegroups: Primary, FileGroupData and FileGroupIndex. The clustered index uses FileGroupData and a non-clustered index is set to use FileGroupIndex. To simulate a disk failure I shut down SQL Server and manually deleted the files in the index filegroup. To start the database I marked the files OFFLINE, but after that I can't delete the index files, which are now offline. I don't have a backup of the files, as they are merely indices, but that has the implication that I can't restore the files and bring their status back to ONLINE. How would you recommend removing the files and the filegroup? They still show up in Management Studio under files/filegroups, and Management Studio is not able to delete them. As far as I can tell this is different from the question posted in http://stackoverflow.com/questions/462637/how-do-i-remove-offline-files-from-a-sql-server-2005-database /Philip

    Read the article

  • Android: ProgressDialog.show() crashes with getApplicationContext

    - by Felix
    I can't seem to grasp why this is happening. This code:

        mProgressDialog = ProgressDialog.show(this, "", getString(R.string.loading), true);

    works just fine. However, this code:

        mProgressDialog = ProgressDialog.show(getApplicationContext(), "", getString(R.string.loading), true);

    throws the following exception:

        W/WindowManager(  569): Attempted to add window with non-application token WindowToken{438bee58 token=null}. Aborting.
        D/AndroidRuntime( 2049): Shutting down VM
        W/dalvikvm( 2049): threadid=3: thread exiting with uncaught exception (group=0x4001aa28)
        E/AndroidRuntime( 2049): Uncaught handler: thread main exiting due to uncaught exception
        E/AndroidRuntime( 2049): java.lang.RuntimeException: Unable to start activity ComponentInfo{com.tastekid.TasteKid/com.tastekid.TasteKid.YouTube}: android.view.WindowManager$BadTokenException: Unable to add window -- token null is not for an application
        E/AndroidRuntime( 2049):     at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2401)
        E/AndroidRuntime( 2049):     at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2417)
        E/AndroidRuntime( 2049):     at android.app.ActivityThread.access$2100(ActivityThread.java:116)
        E/AndroidRuntime( 2049):     at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1794)
        E/AndroidRuntime( 2049):     at android.os.Handler.dispatchMessage(Handler.java:99)
        E/AndroidRuntime( 2049):     at android.os.Looper.loop(Looper.java:123)
        E/AndroidRuntime( 2049):     at android.app.ActivityThread.main(ActivityThread.java:4203)
        E/AndroidRuntime( 2049):     at java.lang.reflect.Method.invokeNative(Native Method)
        E/AndroidRuntime( 2049):     at java.lang.reflect.Method.invoke(Method.java:521)
        E/AndroidRuntime( 2049):     at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:791)
        E/AndroidRuntime( 2049):     at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:549)
        E/AndroidRuntime( 2049):     at dalvik.system.NativeStart.main(Native Method)
        E/AndroidRuntime( 2049): Caused by: android.view.WindowManager$BadTokenException: Unable to add window -- token null is not for an application
        E/AndroidRuntime( 2049):     at android.view.ViewRoot.setView(ViewRoot.java:460)
        E/AndroidRuntime( 2049):     at android.view.WindowManagerImpl.addView(WindowManagerImpl.java:177)
        E/AndroidRuntime( 2049):     at android.view.WindowManagerImpl.addView(WindowManagerImpl.java:91)
        E/AndroidRuntime( 2049):     at android.app.Dialog.show(Dialog.java:238)
        E/AndroidRuntime( 2049):     at android.app.ProgressDialog.show(ProgressDialog.java:107)
        E/AndroidRuntime( 2049):     at android.app.ProgressDialog.show(ProgressDialog.java:90)
        E/AndroidRuntime( 2049):     at com.tastekid.TasteKid.YouTube.onCreate(YouTube.java:45)
        E/AndroidRuntime( 2049):     at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1123)
        E/AndroidRuntime( 2049):     at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2364)
        E/AndroidRuntime( 2049):     ... 11 more

    Any ideas why this is happening? I'm calling this from the onCreate method.

    Read the article

  • jQuery hiding a div on click outside

    - by Lee
    Hi all. I would like to build a simple menu for each list element clicked on, but hide this div once you click outside it. Here is some simple code which hopefully will make sense:

        $('.drillFolder').click(function(){
            var id = $(this).attr('data-folder');
            $(".drillDownFolder ul li > a").attr('data-id', id);
            $(".drillDownFolder").show();
        });

        $("body").click(function(e){
            if(e.target.className !== "drillDownFolder") {
                $(".drillDownFolder").hide();
            }
        });

        //The hidden div
        <div class="drillDownFolder" style="display:none">
            <ul>
                <li><a href="#" data-id="">Show Image</a></li>
                <li><a href="#" data-id="">Edit Image</a></li>
            </ul>
        </div>

    I know what's wrong: the menu is shown via the .drillFolder links, and the body click handler then hides it straight away. How can I avoid this? Thank you if you can advise.

    Read the article

  • UIAlertView -show causing a memory leak

    - by Erik
    I'm relatively new to iPhone development, so this may be my fault, but it goes against what I've seen. :) I think that I'm creating a UIAlertView that lives just in the vacuum of the 'if' statement.

        NSData *data = [NSURLConnection sendSynchronousRequest:request returningResponse:&response error:&error];
        if(!data) {
            // Add an alert
            UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error"
                                                            message:@"Unable to contact server"
                                                           delegate:nil
                                                  cancelButtonTitle:@"Ok"
                                                  otherButtonTitles:nil];
            NSLog(@"retain count before show: %i", alert.retainCount);
            [alert show];
            NSLog(@"retain count before release: %i", alert.retainCount);
            [alert release];
            NSLog(@"retain count after release: %i", alert.retainCount);
            return nil;
        }

    However, the console logs baffle me:

        retain count before show: 1
        retain count before release: 6
        retain count after release: 5

    I've also tried adding alert = nil; after the release. That makes the retain count 0, but I still show a leak. And if it helps, the leak's Responsible Frame is UIKeyboardInputManagerClassForInputMode. I'm also using OS 4 Beta 3. So, anyone have any ideas how a local UIAlertView's retain count would increment itself by 5 when calling -show? Thanks for your help!

    Read the article

  • Yii , show tooltip in cgridview( table ) value

    - by Kiran
    I want to show a tooltip on a CGridView column value: on hover, the column should show the whole content stored in the variable. I want the content of $data["comment"] to appear as the tooltip (title), but currently the tooltip shows the literal string $data["comment"]:

        array(
            'name'=>'Comment',
            'header'=>'Comment',
            'value'=>'(strlen($data["comment"])>35)?substr($data["comment"], 0, 35)."..":$data["comment"];',
            'htmlOptions'=>array('title'=>'$data["comment"]'), // this is what I have done
        ),

    Read the article

  • What do I need in order to extract and combine text files from multiple ZIP files, via command line?

    - by Iszi
    I've got an interesting scripting challenge in front of me. I'm fairly certain there's a way to do it, but I feel like I'm probably lacking some particular tools and/or functional knowledge. There are some fifty-plus ZIP files that each contain, among other things, text files that need to be merged with one another. The structure is something like this:

        C:\Reports\FirstJob-1.zip
        |-MyName
          |-FirstJob
            |-1
              |-[Some other folders]
              |-TXTReports
                |-English
                  |-[Some other files]
                  |-Report.txt

        C:\Reports\FirstJob-2.zip
        |-MyName
          |-FirstJob
            |-1
              |-[Some other folders]
              |-TXTReports
                |-English
                  |-[Some other files]
                  |-Report.txt

        C:\Reports\SecondJob-1.zip
        |-MyName
          |-SecondJob
            |-1
              |-[Some other folders]
              |-TXTReports
                |-English
                  |-[Some other files]
                  |-Report.txt

    If I had all the Report.txt files in one regular folder, and uniquely named, I could probably just write a FOR statement that targets *.txt and runs something like "type filename.txt >> Consolidated.txt" on each. However, these all have the same file name and are embedded deep within separate ZIP files. The potentially useful tools I currently have at my disposal are Windows XP Professional SP3, PowerShell, and WinZip. I'd rather not download or install anything else, but I do understand that third-party tools (or additional tools from Microsoft or WinZip) may be necessary. Whatever tools I use should run natively in Windows. I really don't want to have to mess with Cygwin or other emulators on this system. At the very least, I need a tool that will allow me to analyze and manipulate ZIP files from the command line. Also, are there any other particular complications to this that I've not yet thought of?
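    One way to sketch the merge, shown here in Python with the standard zipfile module rather than the PowerShell/WinZip route the asker prefers (the paths, and the assumption that a Python installation is available, are mine):

        # Sketch: append every Report.txt found inside the ZIPs under C:\Reports
        # to one combined file, labelling each chunk with the ZIP it came from.
        import zipfile
        from pathlib import Path

        reports_dir = Path(r"C:\Reports")            # placeholder location
        output = reports_dir / "Consolidated.txt"

        with output.open("w", encoding="utf-8") as merged:
            for zip_path in sorted(reports_dir.glob("*.zip")):
                with zipfile.ZipFile(zip_path) as archive:
                    for name in archive.namelist():
                        if name.endswith("TXTReports/English/Report.txt"):
                            text = archive.read(name).decode("utf-8", errors="replace")
                            merged.write("===== %s : %s =====\n" % (zip_path.name, name))
                            merged.write(text + "\n")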

    Read the article

  • jquery show/hide integrated with toggle/accordion effect

    - by mark
    I have a show/hide toggle working well in multiple instances (thanks to help here – search for 'jquery toggle to work in multiple instances'). I want to integrate it into an expanding menu / accordion style for the main categories. I have a script and it works on its own, but I can't get it to work integrated with the show/hide. Any help greatly appreciated.

    The working show/hide: http://pastebin.me/c69869d7a80fdb439ca16304b821c146
    The expanding menu script I want to integrate: http://pastebin.me/03b685f586fef84193b5fd72e815255d

    Read the article

  • Handling Hide/Show dock icon menu in AIR on OS X

    - by Alan
    I'm trying to figure out how to access the Show/Hide option that OS X automatically adds to the dock icon menu. The problem is that no matter what I do to hide my app, the dock icon menu will always show Hide, and only if I click that option does it switch to Show. I want to have my app toggle visibility using the Invoke event, but if a user hides the app that way and then right-clicks the dock icon, they won't see Show, just Hide. Is there an event I can monitor for it? Or one that I can trigger? I just want that menu option's state to be synced to whatever visibility status I set programmatically. This has been driving me nuts!

    Read the article
