Search Results

Search found 2899 results on 116 pages for 'zip'.

Page 20/116

  • evaluation of a java thread dump

    - by raticulin
    I got a thread dump of one of my processes. It has a bunch of these threads. I guess they are holding a lot of memory, so I am getting an OOM.

      "Thread-8264" prio=6 tid=0x4c94ac00 nid=0xf3c runnable [0x4fe7f000]
         java.lang.Thread.State: RUNNABLE
          at java.util.zip.Inflater.inflateBytes(Native Method)
          at java.util.zip.Inflater.inflate(Inflater.java:223)
          - locked <0x0c9bc640> (a java.util.zip.Inflater)
          at org.apache.commons.compress.archivers.zip.ZipArchiveInputStream.read(ZipArchiveInputStream.java:235)
          at com.my.ZipExtractorCommonsCompress.extract(ZipExtractorCommonsCompress.java:48)
          at com.my.CustomThreadedExtractorWrapper$ExtractionThread.run(CustomThreadedExtractorWrapper.java:151)
         Locked ownable synchronizers:
          - None

      "Thread-8241" prio=6 tid=0x4c94a400 nid=0xb8c runnable [0x4faef000]
         java.lang.Thread.State: RUNNABLE
          at java.util.zip.Inflater.inflateBytes(Native Method)
          at java.util.zip.Inflater.inflate(Inflater.java:223)
          - locked <0x0c36b808> (a java.util.zip.Inflater)
          at org.apache.commons.compress.archivers.zip.ZipArchiveInputStream.read(ZipArchiveInputStream.java:235)
          at com.my.ZipExtractorCommonsCompress.extract(ZipExtractorCommonsCompress.java:48)
          at com.my.CustomThreadedExtractorWrapper$ExtractionThread.run(CustomThreadedExtractorWrapper.java:151)
         Locked ownable synchronizers:
          - None

    I am trying to find out how it arrived at this situation. CustomThreadedExtractorWrapper is a wrapper class that fires a thread to do some work (ExtractionThread, which uses ZipExtractorCommonsCompress to extract zip contents from a compressed stream). If the task is taking too long, ExtractionThread.interrupt() is called to cancel the operation. I can see in my logs that the cancellation happened 25 times, and I see 21 of these threads in my dump. My questions:

    What is the status of these threads? Alive and running? Blocked somehow? Apparently they did not die with .interrupt()?
    Is there a sure way to really kill a thread?
    What does 'locked' really mean in the stack trace?

    Line 223 in Inflater.java is:

      public synchronized int inflate(byte[] b, int off, int len) {
          ... // return is line 223
          return inflateBytes(b, off, len);
      }
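    Not part of the original question: interrupt() only sets a flag, and a thread that is RUNNABLE inside native code such as Inflater.inflateBytes never checks that flag, so the extraction threads keep running and keep their buffers reachable. A common workaround is to close the stream the worker reads from, which typically makes the blocked read fail with an IOException. A minimal Java sketch, with hypothetical class and method names (CancellableExtraction, extractWithTimeout) that are not from the question:

      import java.io.IOException;
      import java.io.InputStream;
      import java.util.concurrent.TimeUnit;

      // Hypothetical cancellable wrapper: names are illustrative, not from the question.
      public class CancellableExtraction {

          public static void extractWithTimeout(InputStream source, Runnable extraction,
                                                long timeout, TimeUnit unit) throws InterruptedException {
              Thread worker = new Thread(extraction, "zip-extractor");
              worker.start();
              worker.join(unit.toMillis(timeout));
              if (worker.isAlive()) {
                  // interrupt() alone does not unblock a thread stuck in native inflate;
                  // closing the stream it reads from makes the blocked read() throw an
                  // IOException, which lets the worker terminate and release its buffers.
                  try {
                      source.close();
                  } catch (IOException ignored) {
                      // best effort: the worker will observe the closed stream
                  }
                  worker.interrupt();
                  worker.join();
              }
          }
      }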

    Read the article

  • How to modify an XML file using PHP? Use a node to retrieve the value of new child nodes and add them

    - by Avinash Sonee
    I have pasted the example of what I need here: http://pastie.org/1005178

    I have an XML file with, say, the following info:

      <state>
        <info>
          <name>hello</name>
          <zip>51678</zip>
        </info>
        <info>
          <name>world</name>
          <zip>54678</zip>
        </info>
      </state>

    Now I need to create a new XML file which has the following:

      <state>
        <info>
          <name>hello</name>
          <zip>51678</zip>
          <lat>17.89</lat>
          <lon>78.90</lon>
        </info>
        <info>
          <name>world</name>
          <zip>54678</zip>
          <lat>16.89</lat>
          <lon>83.45</lon>
        </info>
      </state>

    So basically I need to add lat and lon nodes to each info element. The lat and lon correspond to the zip code and are stored in a MySQL table that contains 3 fields: pin, lat and lon. How do I do this recursively? I have, say, 1000 info elements in a file. Please give me a sample code, because I have already tried using SimpleXML and read the documentation, but I don't seem to understand it properly.
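    Not from the original question, and sketched in Java rather than the PHP the asker wants: the general approach is the same in either language - parse the document, walk each info element, look up lat/lon by its zip value, append the two child elements, and write the result back out. The coords map below stands in for the MySQL lookup; class and method names are illustrative only.

      import java.io.File;
      import java.util.Map;
      import javax.xml.parsers.DocumentBuilderFactory;
      import javax.xml.transform.TransformerFactory;
      import javax.xml.transform.dom.DOMSource;
      import javax.xml.transform.stream.StreamResult;
      import org.w3c.dom.Document;
      import org.w3c.dom.Element;
      import org.w3c.dom.NodeList;

      public class AddLatLon {
          // coords maps zip -> {lat, lon}; in the real case this would be a database lookup.
          public static void enrich(File in, File out, Map<String, double[]> coords) throws Exception {
              Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder().parse(in);
              NodeList infos = doc.getElementsByTagName("info");
              for (int i = 0; i < infos.getLength(); i++) {
                  Element info = (Element) infos.item(i);
                  String zip = info.getElementsByTagName("zip").item(0).getTextContent();
                  double[] c = coords.get(zip);
                  if (c == null) continue;              // no coordinates known for this zip
                  appendChild(doc, info, "lat", String.valueOf(c[0]));
                  appendChild(doc, info, "lon", String.valueOf(c[1]));
              }
              // Serialize the modified DOM back out to the new file.
              TransformerFactory.newInstance().newTransformer()
                      .transform(new DOMSource(doc), new StreamResult(out));
          }

          private static void appendChild(Document doc, Element parent, String name, String value) {
              Element child = doc.createElement(name);
              child.setTextContent(value);
              parent.appendChild(child);
          }
      }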

    Read the article

  • select all values from a dimension for which there are facts in all other dimensions

    - by ideasculptor
    I've tried to simplify for the purposes of asking this question; hopefully this will be comprehensible.

    Basically, I have a fact table with a time dimension, another dimension, and a hierarchical dimension. For the purposes of the question, let's assume the hierarchical dimension is zip code and state. The other dimension is just descriptive; let's call it 'customer', and let's assume there are 50 customers.

    I need to find the set of states for which there is at least one zip code in which EVERY customer has at least one fact row for each day in the time dimension. If a zip code has only 49 customers, I don't care about it. If even one of the 50 customers doesn't have a value for even 1 day in a zip code, I don't care about it. Finally, I also need to know which zip codes qualified the state for selection. Note, there is no requirement that every zip code have a full data set - only that at least one zip code does.

    I don't mind making multiple queries and doing some processing on the client side. This is a dataset that only needs to be generated once per day and can be cached. I don't even see a particularly clean way to do it with multiple queries short of simply brute-force iteration, and there are a heck of a lot of 'zip codes' in the data set (not actually zip codes, but there are approximately 100,000 entries in the lower level of the hierarchy and several hundred in the top level, so zipcode-state is a reasonable analogy).
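    Not part of the original question: as a concrete illustration of the brute-force check the asker alludes to, here is a minimal in-memory Java sketch with a hypothetical flattened fact record (requires Java 16+ for records). A zip code qualifies when the distinct (customer, day) pairs seen for it cover every customer on every day, and a state qualifies when at least one of its zip codes does; the same counting could instead be pushed down into the database.

      import java.util.HashMap;
      import java.util.HashSet;
      import java.util.List;
      import java.util.Map;
      import java.util.Set;

      public class QualifyingZips {
          // Hypothetical flattened fact row: state, zip, customer, day.
          public record Fact(String state, String zip, String customer, String day) {}

          /** Returns state -> qualifying zip codes (zips with a row for every customer on every day). */
          public static Map<String, Set<String>> qualify(List<Fact> facts,
                                                         Set<String> allCustomers,
                                                         Set<String> allDays) {
              // Collect the distinct (customer, day) pairs observed per (state, zip).
              Map<String, Map<String, Set<String>>> seen = new HashMap<>();
              for (Fact f : facts) {
                  seen.computeIfAbsent(f.state(), s -> new HashMap<>())
                      .computeIfAbsent(f.zip(), z -> new HashSet<>())
                      .add(f.customer() + "|" + f.day());
              }
              // Assuming every fact's customer and day come from the two reference sets,
              // full coverage means exactly |customers| * |days| distinct pairs.
              int required = allCustomers.size() * allDays.size();
              Map<String, Set<String>> result = new HashMap<>();
              seen.forEach((state, zips) -> zips.forEach((zip, pairs) -> {
                  if (pairs.size() == required) {
                      result.computeIfAbsent(state, s -> new HashSet<>()).add(zip);
                  }
              }));
              return result;
          }
      }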

    Read the article

  • What's causing this permissions error and how can I work around it?

    - by Scott B
    Warning: move_uploaded_file(/home/site/public_html/wp-content/themes/mytheme/upgrader.zip) [function.move-uploaded-file]: failed to open stream: Permission denied in /home/site/public_html/wp-content/themes/mytheme/uploader.php on line 79

    Warning: move_uploaded_file() [function.move-uploaded-file]: Unable to move '/tmp/phptempfile' to '/home/site/public_html/wp-content/themes/mytheme/upgrader.zip' in /home/site/public_html/wp-content/themes/mytheme/uploader.php on line 79

    There was a problem. Sorry!

    Code is below for that line...

      // permission settings for newly created folders
      $chmod = 0755;

      // Ensures that the correct file was chosen
      $accepted_types = array('application/zip', 'application/x-zip-compressed', 'multipart/x-zip', 'application/s-compressed');
      foreach($accepted_types as $mime_type) {
          if($mime_type == $type) {
              $okay = true;
              break;
          }
      }
      $okay = strtolower($name[1]) == 'zip' ? true : false;
      if(!$okay) {
          die("This upgrader requires a zip file. Please make sure your file is a valid zip file with a .zip extension");
      }

      //mkdir($target);
      $saved_file_location = $target . $filename;

      // Next line is 79
      if(move_uploaded_file($source, $saved_file_location)) {
          openZip($saved_file_location);
      } else {
          die("There was a problem. Sorry!");
      }

    Read the article

  • "The binary you uploaded was invalid. the file was not a valid zip file" Error message uploading app

    - by Keith Loughnane
    Hi, I'm trying to upload an iPhone app binary to iTunes Connect and keep getting the following error message: "The binary you uploaded was invalid. The file was not a valid zip file". I had an app upload OK recently, but this app is having problems. So after a while I carefully went through the following steps, trying to make sure everything was OK. Any help is appreciated. The steps:

    1. Renamed the project (Project > Rename..., enter name into "Rename project to:") to the release name, making sure the name has no spaces.
    2. Cleaned the project.
    3. Made sure references in build settings reflect the new app name.
    4. Created a new app ID matching the project name in the iPhone Provisioning Portal.
    5. Destroyed the old developer and distributor provisioning profiles in the Provisioning Portal, in Xcode and on the iPhone.
    6. Created a new development provisioning profile using the new app name.
    7. Installed the development provisioning profile into Xcode.
    8. Build (Release) for iPhone OS 3.1.3 (highest my phone will upgrade to; I'm assuming this is the current released version). It builds, installs and runs on the actual iPhone - to me this implies the app and developer IDs are OK.
    9. Created a distribution provisioning profile using the existing distributor ID.
    10. Installed the distributor ID into Xcode.
    11. Clean.
    12. Checked that the "Code Signing Identity" and "Any iPhone OS Device" lines in build settings are set to the distributor ID.
    13. Build for release for OS 3.1.3.
    14. Checked the build results to make sure the code is signed with the distributor profile.
    15. Revealed the .app file and compressed it (alt-click, Compress "appName.app").
    16. Uploaded to iTunes Connect. Gives "The binary you uploaded was invalid. The file was not a valid zip file".

    Read the article

  • Error unzipping a file from a Python script running as a daemon

    - by Fedrick
    I am getting an error whenever I try to run the following unzip command from a Python script which is running as a daemon.

    Command:

      unzip abcd.zip > /dev/null

    Error:

      End-of-central-directory signature not found.  Either this file is not
      a zipfile, or it constitutes one disk of a multi-part archive.  In the
      latter case the central directory and zipfile comment will be found on
      the last disk(s) of this archive.
      unzip:  cannot find zipfile directory in one of abcd.zip or
              abcd.zip.zip, and cannot find abcd.zip.ZIP, period.

    Could anyone help me in this regard? Thanks in advance.
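    Not from the original question, which concerns a Python daemon shelling out to unzip: the message itself says that the file unzip sees is not a complete zip archive, which for a daemon is often a relative path resolving against a different working directory, or a file that is still being written. As a language-neutral illustration of the two usual checks - resolve an absolute path and verify the archive opens before extracting - here is a minimal Java sketch (class and method names are illustrative):

      import java.io.File;
      import java.io.IOException;
      import java.util.zip.ZipFile;

      public class ZipCheck {
          /** Returns true if the file exists and has a readable central directory. */
          public static boolean isValidZip(File file) {
              // Use an absolute path: a daemon's working directory is often not
              // the directory the script was started from.
              File absolute = file.getAbsoluteFile();
              if (!absolute.isFile()) {
                  return false;                           // missing, or written somewhere else
              }
              // Opening the archive fails with a ZipException (an IOException) when the
              // end-of-central-directory record is missing, i.e. the file is truncated
              // or not a zip at all.
              try (ZipFile zip = new ZipFile(absolute)) {
                  return zip.size() >= 0;
              } catch (IOException e) {
                  return false;
              }
          }

          public static void main(String[] args) {
              System.out.println(isValidZip(new File("abcd.zip")) ? "valid zip" : "not a valid zip");
          }
      }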

    Read the article

  • Validate Canadian Postal Code Regex

    - by Alex Block
    I have a script written in JavaScript to validate Canadian postal codes using a regex, however it does not seem to be working. Here is the script.

    If statement:

      if (myform.zip.value == "" || myform.zip.value == null || myform.zip.value == "Postal Code" || myform.zip.value.length < 12) {
          alert("Please fill in field Postal Code. You should only enter 7 characters");
          myform.zip.focus();
          return false;
      }

    Function:

      function okNumber(myform) {
          var regex = /^[ABCEGHJKLMNPRSTVXY]{1}\d{1}[A-Z]{1} *\d{1}[A-Z]{1}\d{1}$/;
          if (regex.test(myform.zip.value) == false) {
              alert("Input Valid Postal Code");
              myform.zip.focus();
              return false;
          }
          return true;
      }
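    Not part of the original question, and in Java rather than JavaScript since the pattern is the portable part: Canadian postal codes follow letter-digit-letter, an optional space, then digit-letter-digit, where D, F, I, O, Q and U never appear and W and Z are additionally excluded from the first position. A minimal sketch of that check (class and method names are illustrative):

      import java.util.regex.Pattern;

      public class PostalCode {
          // Letter-digit-letter, optional space, digit-letter-digit.
          // D, F, I, O, Q, U never appear; W and Z are also excluded from the first position.
          private static final Pattern CA_POSTAL =
                  Pattern.compile("^[ABCEGHJ-NPRSTVXY]\\d[ABCEGHJ-NPRSTV-Z] ?\\d[ABCEGHJ-NPRSTV-Z]\\d$");

          public static boolean isValid(String value) {
              return value != null && CA_POSTAL.matcher(value.trim().toUpperCase()).matches();
          }

          public static void main(String[] args) {
              System.out.println(isValid("K1A 0B1")); // true
              System.out.println(isValid("Z1A 0B1")); // false: Z not allowed as the first letter
          }
      }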

    Read the article

  • UIWebView loads a local file for a long time

    - by Knodel
    I have a UIWebView which loads .rtfd.zip files like this:

      -(void)viewWillAppear:(BOOL)animated {
          self.title = @"Title";
          if (rowPosledovatelnosti == 0) {
              [self loadFile:@"PosledovatelnostiObzie.rtfd.zip"];
          }
          if (rowPosledovatelnosti == 1) {
              [self loadFile:@"Arifmeticheskaya.rtfd.zip"];
          }
          if (rowPosledovatelnosti == 2) {
              [self loadFile:@"Svojstva_Arifmeticheskoj.rtfd.zip"];
          }
          if (rowPosledovatelnosti == 3) {
              [self loadFile:@"Geometricheskaya.rtfd.zip"];
          }
          if (rowPosledovatelnosti == 4) {
              [self loadFile:@"Predel_Posledovatelnosti.rtfd.zip"];
          }
          if (rowPosledovatelnosti == 5) {
              [self loadFile:@"Summa_Geometricheskoy.rtfd.zip"];
          }
      }

    But on the device it takes some time for the UIWebView to load the content. Is there any way to optimize the loading time?

    Read the article

  • In BASH, how can I find what the upload speed is on my system's active internet interface?

    - by YumYumYum
    I am trying to write a TUI bandwidth-tracing application which on query can instantly tell me that my download and upload speed is XXXX. I have figured out that for download I can use wget and parse its output using BASH, but how do I get the upload speed?

    Example of the download parse method:

    1) Remote download:

      wget http://x.x.com:7007/files/software/vnc.zip
      Length: 1594344 (1.5M) [application/zip]
      Saving to: `vnc.zip'
      100%[==================================================================>] 1,594,344    573K/s   in 2.7s
      2012-03-24 11:35:22 (573 KB/s) - `vnc.zip' saved [1594344/1594344]

    2) Local download tells:

      Length: 1594344 (1.5M) [application/zip]
      Saving to: `vnc.zip'
      100%[==================================================================>] 1,594,344    --.-K/s  in 0.1s
      2012-03-24 06:43:04 (11.4 MB/s) - `vnc.zip' saved [1594344/1594344]

    Read the article

  • Unlink deleting more than just the files passed to it.

    - by RMcLeod
    I'm creating an SQL file, placing this file into a zip file with some images, and then deleting the SQL file with unlink. The strange thing is it deletes the zip file as well.

      if (file_put_contents($sqlFileName, $sql) !== false) {
          $zip = new ZipArchive;
          if ($zip->open($workingDir . $now . '.zip', ZipArchive::CREATE) === true) {
              $zip->addFile($sqlFileName, basename($sqlFileName));
              if(!empty($images)) {
                  foreach ($images as $image) {
                      $zip->addFile($imagesDir . $image, $image);
                  }
              }
          }
          unlink($sqlFileName);
      }

    Read the article

  • Neo4J and Azure and VS2012 and Windows 8

    - by Chris Skardon
    Now, I know that this has been written about, but both of the main places (http://www.richard-banks.org/2011/02/running-neo4j-on-azure.html and http://blog.neo4j.org/2011/02/announcing-neo4j-on-windows-azure.html) utilise VS2010, and well, I'm on VS2012 and Windows 8. Not that I think Win 8 had anything to do with it really, anyhews! I'm going to begin from the beginning; this is my first foray into running something on Azure, so it's been a bit of a learning curve. But luckily the Neo4J guys have got us started, so let's download the VS2010 solution: http://neo4j.org/get?file=Neo4j.Azure.Server.zip

    OK, the other thing we'll need is the VS2012 Azure SDK, so let's get that as well: http://www.windowsazure.com/en-us/develop/downloads/ (I just did the full install). Now, unzip the VS2010 solution and let's open it in VS2012: <your location>\Neo4j.Azure.Server\Neo4j.Azure.Server.sln

    One-way upgrade? Yer! Ignore the migration report - we don't care! Let's build that sucker... Ahhh, 14 errors: "WindowsAzure does not exist in the namespace 'Microsoft'". Not a problem, right? We've installed the SDK, we just need to update the references. We can ignore the Test projects, they don't use Azure; we're interested in the other projects, so what we'll do is remove the broken references and add the correct ones. Expand the references bit of each project, hunt out those yellow exclamation marks, and delete them! You'll need to add the right ones back in (listed below); when you go to the 'Add Reference' dialog, make sure you have 'Assemblies' and 'Framework' selected before you search (and search for 'microsoft.win' to narrow it down). The references you need for each project are:

    CollectDiagnosticsData:
      Microsoft.WindowsAzure.Diagnostics
      Microsoft.WindowsAzure.StorageClient

    Diversify.WindowsAzure.ServiceRuntime:
      Microsoft.WindowsAzure.CloudDrive
      Microsoft.WindowsAzure.ServiceRuntime
      Microsoft.WindowsAzure.StorageClient

    Right, so let's build again... Sweet! No errors.

    Now we need to set up our Blobs. I'm assuming you are using the most up-to-date Java you happen to have downloaded :) in my case that's JRE7, and that is located in: C:\Program Files (x86)\Java\jre7. So, zip up that folder into whatever you want to call it, I went with jre7.zip, and stuck it in a temp folder for now. In that same temp folder I also copied the neo4j zip I was using: neo4j-community-1.7.2-windows.zip

    OK, now we need to get these into our Blob storage. This is where a lot of stuff becomes unstuck - I didn't find any applications that helped me use the blob storage; one would crash (because my internet speed is so slow) and the other just didn't work - sure, it looked like it had worked, but when push came to shove it didn't. So this is how I got my files into Blob (local first):

    1. Run the 'Storage Emulator' (just search for that in the start menu)
    2. That takes a little while to start up, so fire up another instance of Visual Studio in the meantime and create a new Console Application.
    3. Manage NuGet Packages for that solution and add 'Windows Azure Storage'.

    Now you're set up to add the code:

      public static void Main()
      {
          CloudStorageAccount cloudStorageAccount = CloudStorageAccount.DevelopmentStorageAccount;
          CloudBlobClient client = cloudStorageAccount.CreateCloudBlobClient();
          client.Timeout = TimeSpan.FromMinutes(30);
          CloudBlobContainer container = client.GetContainerReference("neo4j"); //This will create it as well

          UploadBlob(container, "jre7.zip", "c:\\temp\\jre7.zip");
          UploadBlob(container, "neo4j-community-1.7.2-windows.zip", "c:\\temp\\neo4j-community-1.7.2-windows.zip");
      }

      private static void UploadBlob(CloudBlobContainer container, string blobName, string filename)
      {
          CloudBlob blob = container.GetBlobReference(blobName);

          using (FileStream fileStream = File.OpenRead(filename))
              blob.UploadFromStream(fileStream);
      }

    This will upload the files to your local storage account (to switch to an Azure one, you'll need to create a storage account and use those credentials when you make your CloudStorageAccount above). To test you've got them uploaded correctly, go to: http://localhost:10000/devstoreaccount1/neo4j/jre7.zip and you will hopefully download the zip file you just uploaded.

    Now that those files are there, we are ready for some final configuration. Right click on the Neo4jServerHost role in the Neo4j.Azure.Server cloud project and click on the 'Settings' tab; we'll need to make some changes. By default, the 1.7.2 edition of Neo4J unzips to: neo4j-community-1.7.2. So we need to update all the 'neo4j-1.3.M02' directories to be 'neo4j-community-1.7.2', and we also need to update the Java runtime location. Now, I also changed the Endpoints settings to be HTTP (from TCP) and to have a port of 7410 (mainly because that's straight down on the numpad).

    The last 'gotcha' is some hard coded consts, which had me looking for ages. They are in the 'ConfigSettings' class of the 'Neo4jServerHost' project, and the ones we're interested in are:

      Neo4jFileName
      JavaZipFileName

    Change those both to what they should be.

    OK, nearly there (I promise)! Run the 'Compute Emulator' (same deal with the Start menu). In your system tray you should have an Azure icon; when the compute emulator is up and running, right click on the icon and select 'Show Compute Emulator UI'.

    The last steps! Make sure the 'Neo4j.Azure.Server' cloud project is set up as the start project and let's hit F5. Tension mounts, the build takes place (you need to accept the UAC warning) and VS does its stuff. If you look at the Compute Emulator UI you'll see some log stuff (which you'll need if this goes awry - but it won't, don't worry!). In a bit, the console and a Java window will pop up. Then the console will bog off, leaving just the Java one, and if we switch back to the Compute Emulator UI and scroll up we should be able to see a line telling us the port number we've been assigned (in my case 7411). (If you can't see it, don't worry: press CTRL+A on the emulator, then CTRL+C, copy all the text and paste it into something like Notepad, then just do a Find for 'port' and you'll soon see it.)

    Go to your favourite browser and head to http://localhost:YOURPORT/ and you should see the WebAdmin! See you on the cloud side hopefully! Chris

    PS Other gotchas! I've been caught out a couple of times:

    I had an instance of Neo4J running as a service on my machine; the Azure instance wanted to run the https version of the server on the same port as the service was running on, and so Java would complain that the port was already in use.

    The first time I converted the project, it didn't update the version of the Azure library to load in the App.Config of the Neo4jServerHost project, and VS would throw an exception saying it couldn't find the Azure dll version 1.0.0.0.

    Read the article

  • FTP script download from linux to windows

    - by user53864
    I'm using the following FTP script on Windows XP to download zip files from Ubuntu cloud servers. A zip file is created every day on the Ubuntu servers and I download it to Windows via this FTP script. I run this script manually every day, as I have to edit the last line (mget /usr/backup_02-11-2010.zip) of the script to match today's date. I want to edit this script so that, when scheduled, it will download only today's zip file at the scheduled time without needing to edit it every day. It's clear that the date is appended to the zip files and is in the format dd-mm-yyyy. Need help...

      open server-ip-here
      username-here
      user-password-here
      lcd C:\Backup\files
      bin
      hash
      prompt
      mget /usr/backup_02-11-2010.zip

    Read the article

  • Upload file to database through PHP code

    - by ruhit
    Hi all, I have made an application to upload files and it's working out well. Now I want to upload my files to a database, and I also want to display the uploaded file names in my list by accessing the database, so kindly help me. My code is given below:

      function uploadFile() {
          global $template;
          //$this->UM_index = $this->session->getUserId();
          switch($_REQUEST['cmd']) {
              case 'upload':
                  $filename = array();
                  //set upload directory
                  //$target_path = "F:" . '/uploaded/';
                  for($i = 0; $i < count($_FILES['ad']['name']); $i++) {
                      if($_FILES["ad"]["name"]) {
                          $filename = $_FILES["ad"]["name"][$i];
                          $source = $_FILES["ad"]["tmp_name"][$i];
                          $type = $_FILES["ad"]["type"];
                          $name = explode(".", $filename);
                          $accepted_types = array('text/html', 'application/zip', 'application/x-zip-compressed', 'multipart/x-zip', 'application/x-compressed');
                          foreach($accepted_types as $mime_type) {
                              if($mime_type == $type) {
                                  $okay = true;
                                  break;
                              }
                          }
                          $continue = strtolower($name[1]) == 'zip' ? true : false;
                          if(!$continue) {
                              $message = "The file you are trying to upload is not a .zip file. Please try again.";
                          }
                          $target_path = "F:" . '/uploaded/' . $filename; // change this to the correct site path
                          if(move_uploaded_file($source, $target_path)) {
                              $zip = new ZipArchive();
                              $x = $zip->open($target_path);
                              if ($x === true) {
                                  $zip->extractTo("F:" . '/uploaded/'); // change this to the correct site path
                                  $zip->close();
                                  unlink($target_path);
                              }
                              $message = "Your .zip file was uploaded and unpacked.";
                          } else {
                              $message = "There was a problem with the upload. Please try again.";
                          }
                      }
                  }
                  echo "Your .zip file was uploaded and unpacked.";
                  $template->main_content = $template->fetch(TEMPLATE_DIR . 'donna1.html');
                  break;
              default:
                  $template->main_content = $template->fetch(TEMPLATE_DIR . 'donna1.html');
                  //$this->assign_values('cmd','uploads');
                  $this->assign_values('cmd','upload');
          }
      }

    My HTML page is:

      <html>
      <link href="css/style.css" rel="stylesheet" type="text/css">
      <!--<form action="{$path_site}{$index_file}" method="post" enctype="multipart/form-data">-->
      <form action="index.php?menu=upload_file&cmd=upload" method="post" enctype="multipart/form-data">
      <div id="main">
        <div id="login">
          <br />
          <br />
          Ad No 1: <input type="file" name="ad[]" id="ad1" size="10" />&nbsp;&nbsp;Image(.zip)<input type="file" name="ad[]" id="ad1" size="10" />
          Sponsor By : <input type="text" name="ad3" id="ad1" size="25" />
          <br />
          <br />
        </div>
      </div>
      </form>
      </html>

    Read the article

  • Internationalize WebCenter Portal - Content Presenter

    - by Stefan Krantz
    Lately we have been involved in engagements where internationalization has been holding the project back from success. In this post we are going to explain how to get Content Presenter and its editorials to comply with the currently selected locale for the WebCenter Portal session. As you probably know by now, WebCenter Portal leverages the localization support from Java Server Faces (JSF); in this post we will assume that the localization is controlled and enforced by switching the current browser's locale between English and Spanish.

    There are two main scenarios in the internationalization of content enabled pages, since Content Presenter offers both presentation of information as well as contribution of information. In this post we will look at how to enable seamless integration of the correct localized version of the back end content file, and how to enable the editor/author to edit the correct localized version of the file based on the current browser locale.

    Solution Scenario 1 - Localization aware content presentation

    Due to the amount of steps required to implement the enclosed solution proposal, I have decided to share the solution with you in grouped components for each facet of the solution. If you want to get more details on each step, you can review the enclosed components. This post will guide you through the steps of enabling each component and what it enables/changes in each section of the system.

    Enable Content Presenter Customization

    By leveraging a predictable naming convention for the data files used to hold the content for the Content Presenter instance, we can easily develop a component that will dynamically switch the name out before presenting the information. The naming convention we leverage is the industry best practice of having a shared identifier as prefix (ContentABC) and a language suffix (_EN) (_ES). So the assumption is that each file pair in the above example should look like the following:

      English version - (ContentABC_EN)
      Spanish version - (ContentABC_ES)

    Based on the above, we can now easily switch the language out by using the localization support from JSF, regardless of the primary version assigned to the Content Presenter instance.
    The Java bean below (oracle.webcenter.doclib.internal.view.presenter.NLSHelperBean) is enclosed in the customization project available for download at the bottom of the post:

      public static final String CP_D_DOCNAME_FORMAT = "%s_%s";
      public static final int CP_UNIQUE_ID_INDEX = 0;
      private ContentPresenter presenter = null;

      public NLSHelperBean() {
          super();
      }

      /**
       * This method updates the configuration for the pageFlowScope to have the correct datafile
       * for the current Locale
       */
      public void initLocaleForDataFile() {
          String dataFile = null;
          // Checking that state of presenter is present, also make sure the item is eligible
          // for localization by locating the "_" in the name
          if(presenter.getConfiguration().getDatasource() != null &&
             presenter.getConfiguration().getDatasource().isNodeDatasource() &&
             presenter.getConfiguration().getDatasource().getNodeIdDatasource() != null &&
             !presenter.getConfiguration().getDatasource().getNodeIdDatasource().equals("") &&
             presenter.getConfiguration().getDatasource().getNodeIdDatasource().indexOf("_") > 0) {
              dataFile = presenter.getConfiguration().getDatasource().getNodeIdDatasource();
              FacesContext fc = FacesContext.getCurrentInstance();
              //Leveraging the current faces context to get current localization language
              String currentLocale = fc.getViewRoot().getLocale().getLanguage().toUpperCase();
              String newDataFile = dataFile;
              String [] uniqueIdArr = dataFile.split("_");
              if(uniqueIdArr.length > 0) {
                  newDataFile = String.format(CP_D_DOCNAME_FORMAT, uniqueIdArr[CP_UNIQUE_ID_INDEX], currentLocale);
              }
              //Replacing the current Node datasource with localized datafile.
              presenter.getConfiguration().getDatasource().setNodeIdDatasource(newDataFile);
          }
      }

    With this bean code available to our WebCenter Portal implementation, we can start the next step: overriding the standard behavior in Content Presenter by applying an MDS taskflow customization to the content presenter taskflow. The following taskflow customization has been applied to the customization project attached to this post:

      Library: WebCenter Document Library Service View
      Path: oracle.webcenter.doclib.view.jsf.taskflows.presenter
      File: contentPresenter.xml

    Changes made in the above customization view:

    1. A new method invocation activity has been added (initLocaleForDataFile)
    2. The method invocation invokes the new NLSHelperBean
    3. The default activity is moved to the new method invocation (initLocaleForDataFile)
    4. The outcome from the method invocation goes to determine-navigation (the original default activity)

    The above changes conclude the presentation modification to support a compatible localization scenario for a content-driven page. In addition, this customization does not limit or disable the out-of-the-box capabilities of WebCenter Portal.
    Steps to enable above customization

    1. Start JDeveloper and open your WebCenter Portal Application
    2. Select "Open Project" and include the extracted project you downloaded (CPNLSCustomizations.zip)
    3. Make sure the build output from the CPNLSCustomizations project is a dependency of your Portal project
    4. Deploy your Portal Application to your WC_CustomPortal managed server
    5. Make sure the naming convention of the two data files follows the above recommendation

    Example result of the solution:

    Solution Scenario 2 - Localization aware content creation and authoring

    As you could see from Solution Scenario 1, we require the naming convention to be strictly followed; in the hands of a user with limited technology knowledge, this can be one of the failing links in the solution. Therefore I strongly recommend that you also follow this part, since it will eliminate that risk and also increase the editor's/author's usability by a magnitude. The current WebCenter Portal architecture leverages WebCenter Content today to maintain, publish and manage content, therefore we need to make a few efforts to make sure this part of the architecture is on board with our new naming practice and also simplifies the creation of content for our end users.

    As you probably remember, the naming convention requires the prefix to be common, so I propose we enable a new component that helps you auto-name the content item's dDocName (this means that the readable title can still be in a human readable format). The new component (WCP-LocalizationSupport.zip) built for this scenario will enable a couple of things:

    1. A new service where a sequential number can be generated on request - service name: GET_WCP_LOCALE_CONTENTID
    2. The content presenter leverages a specific function when launching the content creation wizard from within Content Presenter. The assumption is that users will create the content by clicking the "Create Web Content" button. When clicking the button, the wizard opened is actually running inside the WebCenter Content server; the file executed is contentwizard.hcsp.
    This file uses JSON commands that generate operations in the content server; I have extended this file to create two identical data files instead of one. First, it creates the English version by leveraging the new service and a Global Rule to set the dDocName on the original check-in screen (this global rule is available in a configuration package attached to this blog, NLSContentProfileRule.zip). Secondly, we run a set of JSON javascripts to create the Spanish version with the same details, except for the name, where we replace the suffix with (_ES). Then the content creation wizard ends with its out-of-the-box behavior and assigns the Content Presenter instance the English version. See the Javascript markup below - this can be changed in (WCP-LocalizationSupport.zip/component/WCP-LocalizationSupport/publish/webcenter):

      //---------------------------------------A-TEAM---------------------------------------
      WCM.ContentWizard.CheckinContentPage.OnCheckinComplete = function(returnParams)
      {
          var callback = WCM.ContentWizard.CheckinContentPage.checkinCompleteCallback;
          WCM.ContentWizard.ChooseContentPage.OnSelectionComplete(returnParams, callback);
          // Load latest DOC_INFO_SIMPLE
          var cgiPath = DOCLIB.config.httpCgiPath;
          var jsonBinder = new WCM.Idc.JSONBinder();
          jsonBinder.SetLocalDataValue('IdcService', 'DOC_INFO_SIMPLE');
          jsonBinder.SetLocalDataValue('dID', returnParams.dID);
          jsonBinder.Send(cgiPath, $CB(this, function(http) {
              var ret = http.GetResponseText();
              var binder = new WCM.Idc.JSONBinder(ret);
              var dDocName = binder.GetResultSetValue('DOC_INFO', 'dDocName', 0);
              if(dDocName.indexOf("_") > 0){
                  var ssBinder = new WCM.Idc.JSONBinder();
                  ssBinder.SetLocalDataValue('IdcService', 'SS_CHECKIN_NEW');
                  //Additional Localization dDocName generated
                  ssBinder.SetLocalDataValue('dDocName', getLocalizedDocName(dDocName, "es"));
                  ssBinder.SetLocalDataValue('primaryFile', 'default.xml');
                  ssBinder.SetLocalDataValue('ssDefaultDocumentToken', 'SSContributorDataFile');

                  for(var n = 0 ; n < binder.GetResultSetFields('DOC_INFO').length ; n++) {
                      var field = binder.GetResultSetFields('DOC_INFO')[n];
                      if(field != 'dID' &&
                         field != 'dDocName' &&
                         field != 'dID' &&
                         field != 'dReleaseState' &&
                         field != 'dRevClassID' &&
                         field != 'dRevisionID' &&
                         field != 'dRevLabel') {
                          ssBinder.SetLocalDataValue(field, binder.GetResultSetValue('DOC_INFO', field, 0));
                      }
                  }
                  ssBinder.Send(cgiPath, $CB(this, function(http) {}));
              }
          }));
      }

      //Support function to create localized dDocNames
      function getLocalizedDocName(dDocName, lang) {
          var result = dDocName.replace("_EN", ("_" + lang));
          return result;
      }
      //---------------------------------------A-TEAM---------------------------------------

    3. By applying the enclosed NLSContentProfileRule.zip, the check-in screen for data file creation will have auto-naming enabled with a localization suffix (the default is English). You can change the default language by updating the GlobalNlsRule and assigning the preferred prefix. See the rule markup for the dDocName field below:

      <$executeService("GET_WCP_LOCALE_CONTENTID")$><$dprDefaultValue=WCP_LOCALE.LocaleContentId & "_EN"$>

    Steps to enable above extensions and configurations

    1. Install the WebCenter component (WCP-LocalizationSupport.zip), via the AdminServer in the WebCenter Content Administration menus
    2. Enable the component and restart the content server
    3. Apply the configuration bundle to enable the new Global Rule (GlobalNlsRule), via the WebCenter Content Administration/Config Migration Admin

    New Content Creation Experience Result

    Content Editing

    Content editing will by default be enabled for authoring in the currently selected locale, since the content file is selected by (Solution Scenario 1); this means that a user can switch his browser locale and then get an editing experience adapted to the currently selected locale.

    Notes

    A-Team are planning to post a solution on how to switch the locale of the WebCenter Portal session inline, so the Content Presenter, Navigation Model and other Faces-related features are localized accordingly. Content Presenter examples used in this post are an extension to the following post: https://blogs.oracle.com/ATEAM_WEBCENTER/entry/enable_content_editing_of_iterative

    Downloads

    CPNLSCustomizations.zip - WebCenter Portal, Content Presenter Customization: https://blogs.oracle.com/ATEAM_WEBCENTER/resource/stefan.krantz/CPNLSCustomizations.zip

    WCP-LocalizationSupport.zip - WebCenter Content, extension component to enable localized creation of files with compliant auto naming: https://blogs.oracle.com/ATEAM_WEBCENTER/resource/stefan.krantz/WCP-LocalizationSupport.zip

    NLSContentProfileRule.zip - WebCenter Content, configuration update bundle to enable the Global Rule for new check-in naming of data files: https://blogs.oracle.com/ATEAM_WEBCENTER/resource/stefan.krantz/NLSContentProfileRule.zip

    Read the article

  • commons-logging-1.1.jar; cannot read zip file entry

    - by user1226162
    I have imported a GWT project from Git, but when I run Maven install it says:

      .m2\repository\commons-logging\commons-logging\1.1\commons-logging-1.1.jar; cannot read zip file entry

    and if I simply run my application, I get this:

      \git\my-Search-Engine\qsse\war}: java.lang.NoClassDefFoundError: com/google/inject/servlet/GuiceServletContextListener

    I tried to find a way around it; one solution I found was to move guice-servlet-3.0 from the build path to \qsse\war\webinf\lib, but if I do that I start getting the exception:

      java.lang.NoClassDefFoundError: com/google/inject/Injector

    Any idea how I can resolve this?

    Read the article

  • How Do I get City State Zip in MVC 3 URL Route without writing a controller for every state and actions for each city

    - by OpTech Marketing
    I have the need to have the URLs in my boss's application look like:

      http://domain.com/Texas/Dallas/72701

    However, I don't want to write a controller for every state and an action for every city.

      routes.MapRoute(
          "DrillDown", // Route name
          "{controller}/{action}/{ZipId}",
          new { controller = "State", action = "City", ZipId = UrlParameter.Optional } // Parameter defaults
      );

    Can someone help me write a pattern for the routes that will accept State/City/Zip without destroying the ability for me to have regular routes with controller/Action/Param? Looking all over and can't find any direction.

    Read the article

  • Way to get a timezone from a zip code?

    - by Vaccano
    Anyone know of a web service or something out there that I could give a zip code and get a time zone back? Or is that something I just need to write myself? If so, any hints or guides on how to do that? I use Visual Studio 2008 and C#.

    Read the article

  • Mutating XML in Clojure

    - by mac
    Clojure's clojure.xml/parse, clojure.zip/xml-zip and clojure.contrib.zip-filter.xml/xml-> are excellent tools for pulling values out of XML, but what if I want to change the XML (the result of clojure.zip/xml-zip) based on what I learn from xml-> "queries" and write the result back out as XML? I would have expected that (clojure.contrib.prxml/prxml (clojure.xml/parse xml-content)) would spit back XML, but that is not the case.

    Read the article
