Search Results

Search found 282 results on 12 pages for 'extraction'.


  • Wrong extraction of .attr("href") in IE7 vs all other browsers?

    - by EmKay
    Can it really be true that the attr("href") command for a link is handled very differently in IE7 compared to all other browsers? Let's say I have a page at http://example.com/page.html and I have this HTML: <a href="#someAnchor" class="lnkTest">Link text</a> and this jQuery: var strHref = $(".lnkTest").attr("href"); In IE7 the value of the strHref variable will be "http://example.com/page.html#someAnchor", but in other browsers it will be "#someAnchor". I believe the latter is the correct behavior, so is it just a case of IE7 being a bad boy, or is it a bug in jQuery?
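
    A common workaround (a sketch based on the usual answer to this class of problem, not from this thread): bypass jQuery and read the raw attribute. Older IE versions accept a non-standard second argument to getAttribute (iFlags), where 2 requests the value exactly as authored:

        // Hedged sketch: read the href as authored, falling back to the
        // standard call on browsers without the IE-only iFlags argument.
        var el = $(".lnkTest").get(0);
        var strHref = el.getAttribute("href", 2) || el.getAttribute("href");

        // If only the fragment matters, extracting it avoids the issue entirely:
        var hashPos = strHref.indexOf("#");
        var anchor = hashPos >= 0 ? strHref.substring(hashPos) : strHref;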

    Read the article

  • Details on the following Natural Language Processing terms?

    - by wefwgeweg
    Named Entity Extraction (extract people, cities, organizations); Content Tagging (extract topic tags by scanning a document); Structured Data Extraction; Topic Categorization (taxonomy classification by scanning a document, e.g. Bayesian); Text Extraction (HTML page cleaning). Are there libraries that I can use to do any of the above NLP functions? I don't really feel like forking out cash to AlchemyAPI.
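
    Open-source toolkits cover several of these tasks; as one illustration (a sketch, not from the original post), spaCy handles named entity extraction in a few lines:

        # Hedged sketch: named entity extraction with the open-source spaCy library.
        # Assumes the small English model has been installed via:
        #   python -m spacy download en_core_web_sm
        import spacy

        nlp = spacy.load("en_core_web_sm")
        doc = nlp("Oracle acquired Sun Microsystems, based in Santa Clara, in 2010.")
        for ent in doc.ents:
            print(ent.text, ent.label_)   # e.g. "Oracle" ORG, "Santa Clara" GPE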

    Read the article

  • Type errors when using the same name

    - by lykimq
    I have 3 files: 1) cpf0.ml: type string = char list type url = string type var = string type name = string type symbol = | Symbol_name of name 2) problem.ml: type symbol = | Ident of string 3) test.ml: open Problem;; open Cpf0;; let symbol b = function | Symbol_name n -> Ident n When I compile test.ml (ocamlc -c test.ml), I receive an error: This expression has type Cpf0.name = char list but an expression was expected of type string Could you please help me correct it? Thank you very much. EDIT: Thank you for your answer. I want to explain more about these 3 files, because I am working with extraction from Coq to OCaml types: cpf0.ml is generated from cpf.v: Require Import String. Definition string := string. Definition name := string. Inductive symbol := | Symbol_name : name -> symbol. The extraction code, extraction.v: Set Extraction Optimize. Extraction Language Ocaml. Require ExtrOcamlBasic ExtrOcamlString. Extraction Blacklist cpf list. (this uses ExtrOcamlString). I opened Cpf0 (open Cpf0;;) in problem.ml, and I got a new problem because problem.ml has another definition of type string: This expression has type Cpf0.string = char list but an expression was expected of type Util.StrSet.elt = string Here is the definition in util.ml that defines type string: module Str = struct type t = string end;; module StrOrd = Ord.Make (Str);; module StrSet = Set.Make (StrOrd);; module StrMap = Map.Make (StrOrd);; let set_add_chk x s = if StrSet.mem x s then failwith (x ^ " already declared") else StrSet.add x s;; I was trying to change t = string to t = char list, but if I do that I have to change a lot of functions that depend on it (for example, set_add_chk above). Could you please suggest a good approach for this case?
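
    One workable approach (a sketch, assuming Problem.Ident really expects OCaml's native string) is to convert at the module boundary instead of changing Util's definitions:

        (* Hedged sketch: bridge Cpf0's extracted "string" (a char list)
           and OCaml's native string at the point of use. *)
        let string_of_char_list (cl : char list) : string =
          let buf = Buffer.create (List.length cl) in
          List.iter (Buffer.add_char buf) cl;
          Buffer.contents buf

        (* test.ml then becomes: *)
        let symbol = function
          | Cpf0.Symbol_name n -> Problem.Ident (string_of_char_list n)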

    Read the article

  • How can I avoid a few seconds of blank video when using -vcodec copy?

    - by arlomedia
    I'm processing user-uploaded videos on a CentOS web server with ffmpeg. I need to convert each video to a standard size and format, then extract a 30-second sample clip from each video. I want to use the "-vcodec copy" flag in the extraction command to avoid encoding a second time. This command works for my initial conversion: ffmpeg -i uploaded.mov -f mp4 -vcodec libx264 -vpre medium -acodec libfaac -r 15 -b 360k -ab 48k -ar 22050 -s 480x320 formatted.mp4 And this sometimes works for the extraction: ffmpeg -i formatted.mp4 -vcodec copy -acodec copy -ss 0 -t 30 formatted_sample.mp4 However, when I run the extraction command on some videos, the extracted sample clip starts with several seconds of blank video. The audio starts right away but the video doesn't start for 3-6 seconds. To demonstrate the problem, I've uploaded two video clips and run the above commands on them. I created the first clip in Final Cut Express and encoded it with Handbrake before uploading to the web server: 1a) uploaded clip 1b) converted with first command 1c) extracted with second command, missing first six seconds By comparison, this second clip comes from Apple's website and does not show the problem: 2a) uploaded clip 2b) converted with first command 2c) extracted with second command, no problem Can anyone see what's different about the two source clips? And if so, is there anything I can do in my conversion command so that when the extraction command runs, the clip is set up to avoid the missing video? By the way, I initially had the problem with ffmpeg 0.6.1 installed from yum, but I upgraded to the latest git version and the problem remains.
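
    A likely explanation (hedged; this is the classic stream-copy pitfall rather than a diagnosis of these exact files): with -vcodec copy, ffmpeg can only begin the cut at a keyframe, so the extracted clip shows nothing until the first keyframe after the cut point, while the audio plays immediately. Forcing frequent keyframes in the conversion step, e.g. one per second at 15 fps via -g 15 (an illustrative value, not from the original thread), should give the extraction command a keyframe at or near the start:

        ffmpeg -i uploaded.mov -f mp4 -vcodec libx264 -vpre medium -acodec libfaac -r 15 -g 15 -b 360k -ab 48k -ar 22050 -s 480x320 formatted.mp4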

    Read the article

  • Currency Conversion in Oracle BI applications

    - by Saurabh Verma
    Authored by Vijay Aggarwal and Hichem Sellami

    A typical data warehouse contains Star and/or Snowflake schemas, made up of Dimensions and Facts. The facts store various numerical information, including amounts, for example Order Amount, Invoice Amount etc. With the truly global nature of business nowadays, end users want to view reports in their own currency or in a global/common currency as defined by their business. This presents a unique opportunity in BI to provide the amounts at converted rates, either by pre-storing them or by doing on-the-fly conversions while displaying the reports to the users.

    Source Systems

    OBIA caters to various source systems like EBS, PSFT, Siebel, JDE, Fusion etc. Each source has its own unique and intricate ways of defining and storing currency data, doing currency conversions and presenting them to the OLTP users. For example, EBS stores conversion rates between currencies which can be classified by rate type, like Corporate rate, Spot rate, Period rate etc. Siebel stores exchange rates by rate types like Daily. EBS/Fusion store the conversion rates for each day, whereas PSFT/Siebel store them for a range of days. PSFT has a Rate Multiplication Factor and a Rate Division Factor and we need to calculate the rate from them, whereas other source systems store the currency exchange rate directly.

    OBIA Design

    The data consolidation from various disparate source systems poses the challenge of conforming various currencies, rate types, exchange rates etc., and of designing the best way to present the amounts to the users without affecting performance. When consolidating the data for reporting in OBIA, we have designed mechanisms in the Common Dimension to allow users to report based on their required currencies. OBIA Facts store amounts in various currencies. Document Currency: this is the currency of the actual transaction; for a multinational company, this can be in various currencies. Local Currency: this is the base currency in which the accounting entries are recorded by the business; it is generally defined in the Ledger of the company. Global Currencies: OBIA provides five Global Currencies; three are used across all modules, and the last two are for CRM only. A global currency is very useful when creating reports where the data is viewed enterprise-wide. For example, a US-based multinational would want to see the reports in USD, so the company would choose USD as one of the global currencies. OBIA allows users to define up to five global currencies during the initial implementation.

    The term Currency Preference is used to designate the set of values Document Currency, Local Currency, Global Currency 1, Global Currency 2 and Global Currency 3, which are shared among all modules. There are four more currency preferences specific to certain modules: Global Currency 4 (aka CRM Currency) and Global Currency 5, which are used in CRM; and Project Currency and Contract Currency, used in Project Analytics. When choosing Local Currency as the currency preference, the data will show in the currency of the Ledger (or Business Unit) in the prompt, so it is important to select one Ledger or Business Unit when viewing data in Local Currency. More on this can be found in the section Changing Currency Preferences in the Dashboard.

    Design Logic

    When extracting the fact data, the OOTB mappings extract and load the document amount and the local amount into the target tables. They also load the exchange rates required to convert the document amount into the corresponding global amounts.
    If the source system only provides the document amount in the transaction, the extract mapping does a lookup to get the local currency code and the local exchange rate. The load mapping then uses the local currency code and rate to derive the local amount. The load mapping also fetches the Global Currencies and looks up the corresponding exchange rates. The lookup of exchange rates is done via the Exchange Rate Dimension, provided as a Common/Conforming Dimension in OBIA. The Exchange Rate Dimension stores the exchange rates between various currencies for a date range and rate type. Two physical tables, W_EXCH_RATE_G and W_GLOBAL_EXCH_RATE_G, are used to provide the lookups and conversions between currencies. The data is loaded from the source system's Ledger tables. W_EXCH_RATE_G stores the exchange rates between currencies with a date range. W_GLOBAL_EXCH_RATE_G, on the other hand, stores the currency conversions between the document currency and the pre-defined five Global Currencies for each day. Based on the requirements, the fact mappings can decide to use one or both tables to do the conversion. Currency design in OBIA also taps into the MLS and Domain architecture, allowing users to map the currencies to a universal Domain during implementation. This is especially important for companies deploying and using OBIA with multiple source adapters.

    Some Gotchas to Look For

    It is necessary to think through the currencies during the initial implementation. 1) Identify the various types of currencies used by your business. Understand what your Local (or Base) and Document currencies will be. Identify the various global currencies in which your users will want to view reports; this will be based on the global nature of your business. Changes to these currencies later in the project, while permitted, may cause full data loads and hence lost time. 2) If you have a multi-source system, make sure that the Global Currencies and Global Rate Types chosen in Configuration Manager have the corresponding source-specific counterparts. In other words, make sure that for every DW-specific value chosen for Currency Code or Rate Type, a source Domain mapping is already done.

    Technical Section

    This section briefly describes the technical scenarios employed in the OBIA adaptors to extract data from each source system. In OBIA, we have two main tables which store the currency rate information, as explained in previous sections: W_EXCH_RATE_G and W_GLOBAL_EXCH_RATE_G. W_EXCH_RATE_G stores all the currency conversions present in the source system and captures data for a date range. W_GLOBAL_EXCH_RATE_G has Global Currency conversions stored at a daily level; the challenge here is to store all 5 Global Currency exchange rates in a single record for each From Currency. Let's look at the source system extraction logic for each of these tables.

    EBS

    In EBS, currency data is stored in the GL_DAILY_RATES table. As the name indicates, GL_DAILY_RATES has data at a daily level. However, in the warehouse we store the data with a date range and insert a new range record only when the exchange rate changes for a particular From Currency, To Currency and Rate Type. The main logical steps employed in this process are:
    0. (Incremental flow only) Clean up the data in W_EXCH_RATE_G: delete the records whose Start Date is greater than the minimum conversion date, and update the End Date of the existing records.
    1. Compress the daily data from the GL_DAILY_RATES table into range records; the incremental map uses $$XRATE_UPD_NUM_DAY as an extra parameter.
    2. Generate the Previous Rate, Previous Date and Next Date for each daily record from the OLTP.
    3. Filter out the records whose Conversion Rate is the same as the previous rate, or whose Conversion Date lies within a single-day range.
    4. Mark the records as 'Keep' or 'Filter' and also derive the final End Date for the single range record (the unique combination of From Date, To Date, Rate and Conversion Date).
    5. Filter out the records marked as 'Filter' in the INFA map.

    The above steps load W_EXCH_RATE_GS; step 0 updates/deletes W_EXCH_RATE_G directly. The SIL map then inserts/updates the GS data into W_EXCH_RATE_G. These steps convert the daily records in GL_DAILY_RATES into range records in W_EXCH_RATE_G. We do not need such special logic for loading W_GLOBAL_EXCH_RATE_G, a table where we store data at a daily granular level. However, we do need to pivot the data, because data present in multiple rows in the source tables needs to be stored in different columns of the same row in the DW; we use GROUP BY and CASE logic to achieve this.

    Fusion

    Fusion has extraction logic very similar to EBS. The only difference is that the cleanup logic mentioned in step 0 above does not use the $$XRATE_UPD_NUM_DAY parameter. In Fusion we bring in all the exchange rates in incremental runs as well and do the cleanup; the SIL then takes care of inserts/updates accordingly.

    PeopleSoft

    PeopleSoft does not have From Date and To Date explicitly in the source tables. Let's look at an example (note that this is achieved from PS1 onwards only): 1 Jan 2010 - USD to INR - 45; 31 Jan 2010 - USD to INR - 46. PSFT stores records in this fashion, which means that the exchange rate of 45 for USD to INR is applicable from 1 Jan 2010 to 30 Jan 2010, and we need to store the data in this fashion in the DW. Also, PSFT stores the exchange rate as RATE_MULT and RATE_DIV, and we need to compute RATE_MULT/RATE_DIV to get the correct exchange rate. We generate the From Date and To Date while extracting data from the source, with certain assumptions: if a record gets updated/inserted in the source, it will be extracted in incremental runs, and if this updated/inserted record falls between other dates, we also extract the preceding and succeeding records (based on dates), because we need to generate range records and there are now 3 records whose ranges have changed. Taking the same example as above, if a new record is inserted on 15 Jan 2010, the new ranges are 1 Jan to 14 Jan, 15 Jan to 30 Jan, and 31 Jan to the next available date. Even though the 1 Jan and 31 Jan records have not changed, we still extract them because their ranges are affected. Similar logic is used for Global Exchange Rate extraction: we create the range records, load them into a temporary table, then join to the Day Dimension, create individual records, and pivot the data to get the 5 Global Exchange Rates for each From Currency, Date and Rate Type.

    Siebel

    Siebel facts depend heavily on Global Exchange Rates, and almost none of them really use individual exchange rates. In other words, W_GLOBAL_EXCH_RATE_G is the main table used in Siebel from the PS1 release onwards. As of January 2002, the Euro triangulation method for converting between currencies belonging to EMU members is not needed for present and future currency exchanges.
    However, the method is still available in Siebel applications, as are the old currencies, so that historical data can be maintained accurately. The following description applies only to historical data needing conversion prior to the 2002 switch to the Euro for the EMU member countries. If a country is a member of the European Monetary Union (EMU), you should convert its currency to other currencies through the Euro. This is called triangulation, and it is used whenever either currency being converted has EMU Triangulation checked. Due to this, there are multiple extraction flows in SEBL, i.e. EUR to EMU, EUR to Non-EMU, EUR to DMC and so on. We load W_EXCH_RATE_G through multiple flows with these data; this has been kept the same as in previous versions of OBIA. W_GLOBAL_EXCH_RATE_G, being a new table, does not have such needs. However, SEBL, like PSFT, does not have From Date and To Date columns in the source tables, so we use extraction logic similar to that explained in the PSFT section for SEBL as well.

    What if all 5 Global Currencies configured are the same?

    As mentioned in previous sections, from PS1 onwards we store Global Exchange Rates in the W_GLOBAL_EXCH_RATE_G table. The extraction logic for this table involves pivoting data from multiple rows into a single row with 5 Global Exchange Rates in 5 columns; as mentioned, we use CASE and GROUP BY functions to achieve this. This approach poses a unique problem when all 5 Global Currencies chosen are the same. For example, if the user configures all 5 Global Currencies as 'USD', the extract logic will not be able to generate a record for From Currency = USD, because not all source systems will have a USD-to-USD conversion record. We have _Generated mappings to take care of this case: they generate a record with Conversion Rate = 1.

    Reusable Lookups

    Before PS1, we had a mapplet for currency conversions. In PS1, we only have reusable lookups: LKP_W_EXCH_RATE_G and LKP_W_GLOBAL_EXCH_RATE_G. These lookups have another layer of logic so that all the lookup conditions are met when they are used in various fact mappings. Anyone who wants to do a lookup on W_EXCH_RATE_G or W_GLOBAL_EXCH_RATE_G must use these lookups; a direct join or lookup on the tables might lead to wrong data being returned.

    Changing Currency Preferences in the Dashboard

    In the 7.9.6.x series, all amount metrics in OBIA showed the Global1 amount, and customers needed to change the metric definitions to show them in another currency preference. Project Analytics has supported currency preferences since the 7.9.6 release, though, and published a tech note for customers of other modules on adding currency-preference toggling to the solution.

    List of Currency Preferences

    Starting from the 11.1.1.x release, the BI Platform added a new feature to support multiple currencies. A new session variable (PREFERRED_CURRENCY) is populated through a newly introduced currency prompt. This prompt can take its values from the XML file userpref_currencies_OBIA.xml, which is hosted in the BI Server installation folder, under <home>\instances\instance1\config\OracleBIPresentationServicesComponent\coreapplication_obips1\userpref_currencies.xml. This file contains the list of currency preferences, like "Local Currency", "Global Currency 1", ..., which customers can also rename to give them more meaningful business names. There are two options for showing the list of currency preferences to the user in the dashboard: Static and Dynamic.
    In Static mode, all users see the full list as in the user preference currencies file. In Dynamic mode, the list shown in the currency prompt drop-down is the result of a dynamic query specified in the same file. Customers can build some security into the RPD, so that the list of currency preferences is based on the user's roles. BI Applications built a subject area, Dynamic Currency Preference, to run this query and give every user only the list of currency preferences required by his application roles.

    Adding Currency to an Amount Field

    When the user selects one of the items from the currency prompt, all the amounts on that page will show in the currency corresponding to that preference. For example, if the user selects "Global Currency 1" from the prompt, all data will show in Global Currency 1 as specified in the Configuration Manager. If the user selects "Local Currency", all amount fields will show in the currency of the Business Unit selected in the BU filter of the same page. If no particular Business Unit is selected in that filter, and the data selected by the query contains amounts in more than one currency (for example, one BU has USD as its functional currency and another has EUR), then subtotals will not be available (USD and EUR amounts cannot be added in one field), and depending on the setup (see the next paragraph), the user may receive an error. There are two ways to add the currency field to an amount metric: 1) In the form of a currency code, like USD, EUR, ...: for this, the user needs to add the field "Apps Common Currency Code" to the report. This field is in every subject area, usually under the table "Currency Tag" or "Currency Code". 2) In the form of a currency symbol ($ for USD, € for EUR, ...): for this, the user needs to format the amount metrics in the report as a currency column, by specifying the currency tag column in the Column Properties option of the Column Actions drop-down list. Typically this column should be the "BI Common Currency Code" available in every subject area. Select the Column Properties option in the Edit list of a metric; in the Data Format tab, select Custom as "Treat Number As"; then enter the following syntax under Custom Number Format: [$:currencyTagColumn=Subjectarea.table.column], where column is the "BI Common Currency Code" defined to take the currency code value based on the currency preference chosen by the user in the currency preference prompt.
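
    To make the GROUP BY / CASE pivot mentioned in the Technical Section concrete, here is a hedged sketch (table and column names are illustrative, not the shipped OBIA schema):

        -- Hedged sketch: collapse per-currency rate rows into one row with the
        -- five global exchange rates in five columns (names are illustrative).
        SELECT from_currency_code,
               conversion_date,
               rate_type,
               MAX(CASE WHEN to_currency_code = 'USD' THEN exchange_rate END) AS global1_rate,
               MAX(CASE WHEN to_currency_code = 'EUR' THEN exchange_rate END) AS global2_rate,
               MAX(CASE WHEN to_currency_code = 'GBP' THEN exchange_rate END) AS global3_rate,
               MAX(CASE WHEN to_currency_code = 'JPY' THEN exchange_rate END) AS global4_rate,
               MAX(CASE WHEN to_currency_code = 'AUD' THEN exchange_rate END) AS global5_rate
        FROM   stg_exchange_rates
        GROUP  BY from_currency_code, conversion_date, rate_type;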

    Read the article

  • VOTE by 20 June for OpenWorld Talk on OWB with Non-Oracle Sources

    - by antonio romero
    OWB/ODI LinkedIn Group member Suraj Bang has offered a topic through OpenWorld 2010 Suggest-a-Session at Oracle Mix: Extend ETL to Heterogeneous and Unstructured Data Sources with OWB 11gR2. To vote for this talk, click through to http://bit.ly/owb_km_openworld and click on the "Vote" button. The abstract follows: Beyond basic Oracle-to-Oracle ETL, data warehousing customers need to integrate data from multiple data sources spanning multiple database vendors, file formats (CSV, XML, HTML) and unstructured data sources like PDFs and log files. This session describes experiences extending OWB 11gR2 to extract data from Postgres, SQL Server, MySQL and Sybase, PDF documents, and more for a major banking client's data warehousing project supporting IT operations. This included metadata extraction, custom knowledge-module-based ETL, and replacing ad-hoc Perl and Java extraction code with a manageable ETL solution built on OWB's extensible platform. Note: You must vote for at least two other talks for your vote to count, so if you haven't already picked your three, also consider: Case Study: Real-Time Data Warehousing and Fraud Detection with Oracle 11gR2.

    Read the article

  • How to make chrome.tabs.update work with a content script

    - by user1673772
    I am working on a little extension for Google Chrome. I want to create a new tab, go to the URL "sample"+i+".com", launch a content script on this URL, update the current tab to "sample"+(i+1)+".com", and launch the same script again. I looked at the Q&A available on Stack Overflow and I googled it, but I didn't find a solution that works. This is my actual code for background.js (it works): it creates two tabs (i=21 and i=22) and loads my content script for each URL. But when I tried to use chrome.tabs.update, Chrome directly launched a tab with i = 22 (and the script worked only once): function extraction(tab) { for (var i = 21; i < 23; i++) { chrome.storage.sync.set({'extraction' : 1}, function() {}); //for my content script chrome.tabs.create({url: "http://example.com/"+i+".html"}, function() {}); } } chrome.browserAction.onClicked.addListener(function(tab) {extraction(tab);}); If anyone can help me: the content script and manifest.json are not the problem. I want to do this 15000 times, so I can't do it any other way. Thank you.
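
    The usual fix (a sketch, not a tested answer from this thread): tab calls are asynchronous, so the loop fires all at once. Navigating one page at a time means waiting for each load to complete before calling chrome.tabs.update with the next URL, e.g. via chrome.tabs.onUpdated:

        // Hedged sketch: walk a single tab through the pages sequentially,
        // waiting for each load (and thus the content script) to finish.
        var FIRST = 21, LAST = 22; // raise LAST for the full 15000-page run

        function visitNext(tabId, i) {
          if (i > LAST) return;
          chrome.tabs.onUpdated.addListener(function listener(updatedId, info) {
            if (updatedId === tabId && info.status === "complete") {
              chrome.tabs.onUpdated.removeListener(listener);
              visitNext(tabId, i + 1); // content script has run; go on
            }
          });
          chrome.tabs.update(tabId, { url: "http://example.com/" + i + ".html" });
        }

        chrome.browserAction.onClicked.addListener(function () {
          chrome.tabs.create({ url: "about:blank" }, function (tab) {
            visitNext(tab.id, FIRST);
          });
        });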

    Read the article

  • Prevent Windows Explorer from extracting metadata

    - by olafure
    Windows Explorer (Windows 7 x64) crashes when it sees allegedly corrupted .wav files. I'm dealing with this problem and the hotfix doesn't work for me: http://support.microsoft.com/kb/976417/en-us The hotfix page says that this happens if the .wav file is corrupt (which, by the way, I don't think it is). What makes this even worse is that I can't access the file in any program! As soon as the open dialog sees the file, Windows tries its metadata extraction trick and explorer.exe halts. So my question: can I by any means tell Windows to stop this "metadata extraction" action? (I have seen multiple problems associated with it in the past.)
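
    One avenue worth trying (an untested sketch; export the key first so it can be restored): unregister the shell property handler for .wav files, which is the component Explorer invokes to read the metadata. The path below is the standard per-extension property handler location; run from an elevated prompt:

        :: Hedged sketch: back up, then remove, the .wav property handler
        :: registration so Explorer stops parsing .wav metadata.
        reg export "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\PropertySystem\PropertyHandlers\.wav" wav-handler-backup.reg
        reg delete "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\PropertySystem\PropertyHandlers\.wav" /f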

    Read the article

  • UI not updated while using ProgressMonitorInputStream in Swing to monitor compressed file decompression

    - by Bozhidar Batsov
    I'm working on a Swing application that relies on an embedded H2 database. Because I don't want to bundle the database with the app (the db is frequently updated and I want new users of the app to start with a recent copy), I've implemented a solution which downloads a compressed copy of the db the first time the application is started and extracts it. Since the extraction process might be slow, I've added a ProgressMonitorInputStream to show the progress of the extraction process. Unfortunately, when the extraction starts, the progress dialog shows up but it's not updated at all. It seems like no events are getting through to the event dispatch thread. Here is the method: public static String extractDbFromArchive(String pathToArchive) { if (SwingUtilities.isEventDispatchThread()) { System.out.println("Invoking on event dispatch thread"); } // Get the current path, where the database will be extracted String currentPath = System.getProperty("user.home") + File.separator + ".spellbook" + File.separator; LOGGER.info("Current path: " + currentPath); try { //Open the archive FileInputStream archiveFileStream = new FileInputStream(pathToArchive); // Read two bytes from the stream before it used by CBZip2InputStream for (int i = 0; i < 2; i++) { archiveFileStream.read(); } // Open the gzip file and open the output file CBZip2InputStream bz2 = new CBZip2InputStream(new ProgressMonitorInputStream( null, "Decompressing " + pathToArchive, archiveFileStream)); FileOutputStream out = new FileOutputStream(ARCHIVED_DB_NAME); LOGGER.info("Decompressing the tar file..."); // Transfer bytes from the compressed file to the output file byte[] buffer = new byte[1024]; int len; while ((len = bz2.read(buffer)) > 0) { out.write(buffer, 0, len); } // Close the file and stream bz2.close(); out.close(); } catch (FileNotFoundException e) { e.printStackTrace(); } catch (IOException ex) { ex.printStackTrace(); } try { TarInputStream tarInputStream = null; TarEntry tarEntry; tarInputStream = new TarInputStream(new ProgressMonitorInputStream( null, "Extracting " + ARCHIVED_DB_NAME, new FileInputStream(ARCHIVED_DB_NAME))); tarEntry = tarInputStream.getNextEntry(); byte[] buf1 = new byte[1024]; LOGGER.info("Extracting tar file"); while (tarEntry != null) { //For each entry to be extracted String entryName = currentPath + tarEntry.getName(); entryName = entryName.replace('/', File.separatorChar); entryName = entryName.replace('\\', File.separatorChar); LOGGER.info("Extracting entry: " + entryName); FileOutputStream fileOutputStream; File newFile = new File(entryName); if (tarEntry.isDirectory()) { if (!newFile.mkdirs()) { break; } tarEntry = tarInputStream.getNextEntry(); continue; } fileOutputStream = new FileOutputStream(entryName); int n; while ((n = tarInputStream.read(buf1, 0, 1024)) > -1) { fileOutputStream.write(buf1, 0, n); } fileOutputStream.close(); tarEntry = tarInputStream.getNextEntry(); } tarInputStream.close(); } catch (Exception e) { } currentPath += "db" + File.separator + DB_FILE_NAME; if (!currentPath.isEmpty()) { LOGGER.info("DB placed in : " + currentPath); } return currentPath; } This method gets invoked on the event dispatch thread (SwingUtilities.isEventDispatchThread() returns true), so the UI components should be updated. I haven't implemented this as a SwingWorker since I need to wait for the extraction anyway before I can proceed with the initialization of the program. This method gets invoked before the main JFrame of the application is visible.
    I don't want a solution based on SwingWorker + property change listeners. I think the ProgressMonitorInputStream is exactly what I need, but I guess I'm not doing something right. I'm using Sun JDK 1.6.18. Any help would be greatly appreciated.
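
    For context (a hedged sketch, not a verified fix for this exact code): ProgressMonitorInputStream can only repaint its dialog when the event dispatch thread is free, so reading the stream on the EDT starves the very dialog it creates. Running the extraction on a plain worker thread and joining it from main(), before any GUI work, preserves the original sequencing:

        // Hedged sketch (JDK 1.6 era): keep the EDT free so the
        // ProgressMonitorInputStream dialog can actually repaint.
        final String[] dbPath = new String[1];
        Thread extractor = new Thread(new Runnable() {
            public void run() {
                dbPath[0] = extractDbFromArchive(pathToArchive); // pathToArchive must be final
            }
        });
        extractor.start();
        try {
            extractor.join(); // safe here: main() is not the event dispatch thread
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        // ...only now build and show the main JFrame...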

    Read the article

  • String Manipulation - Get String between two other Strings?

    - by Ben
    I have a large piece of text in which there is something similar to this: !#_KT_#!COMMANDHERE!#_KT_#! I want, in VB.Net, to get the 'COMMANDHERE' part of the string. How would I go about doing this? I have this so far: Dim temp As String = WebBrowser1.Document.Body.ToString Dim startIndex As Integer = temp.IndexOf("!#__KT__#!") + 1 Dim endIndex As Integer = temp.IndexOf("!#__KT__#!", startIndex) Dim extraction As String = temp.Substring(startIndex, endIndex - startIndex).Trim TextBox1.Text = extraction However, this only removes the LAST string fully, e.g.: #_KT_#!COMMAND. Any help is appreciated!
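
    The likely culprit (a sketch of a fix, with the caveat that the sample text shows single underscores in the marker while the code searches for double ones, which is worth reconciling first): startIndex advances by 1 instead of by the length of the opening marker:

        ' Hedged sketch: skip the whole opening marker, not just one character.
        Dim marker As String = "!#_KT_#!"   ' match whichever variant the text really uses
        Dim temp As String = WebBrowser1.Document.Body.ToString()
        Dim startIndex As Integer = temp.IndexOf(marker) + marker.Length
        Dim endIndex As Integer = temp.IndexOf(marker, startIndex)
        Dim extraction As String = temp.Substring(startIndex, endIndex - startIndex).Trim()
        TextBox1.Text = extraction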

    Read the article

  • Extracting data points from a matrix and saving them in different matrices in MATLAB

    - by Hossein
    Hi, I have a 2D matrix consisting of some coordinates, for example: Data(X,Y): 45.987543423, 5.35000964 52.987544223, 5.98765234 I also have an array consisting of integers >= 0, for example: Cluster(M): 2, 0, 3, 1 Each of the numbers in this array corresponds to a row of my 2D matrix above. For example, it says that row one (a coordinate) in the Data matrix belongs to cluster 2, the second row belongs to cluster 0, and so on. Now I want the data points of each cluster in a separate matrix: for example, I want to save the data points belonging to cluster 1 in one matrix, cluster 2 in another matrix, and so on. I could do this manually, but the problem is that it has to be an automatic extraction: the number of clusters (the range of the numbers in the cluster array) varies in each run, so I need a general algorithm that does this extraction for me. Can someone help me please? Thanks
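
    One general approach (a sketch, assuming Cluster is a vector with one label per row of Data): collect each cluster's rows into a cell array with logical indexing, so any number of clusters works:

        % Hedged sketch: one cell per cluster label, whatever the label range.
        labels = unique(Cluster);                 % the labels that actually occur
        clusters = cell(numel(labels), 1);
        for k = 1:numel(labels)
            clusters{k} = Data(Cluster == labels(k), :);   % rows of that cluster
        end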

    Read the article

  • How to identify deadlock conditions in a third-party application?

    - by Imhotep is Invisible
    I am using a third-party application to handle batch CD audio extraction via multiple FireWire-attached devices, but the application frequently (though non-deterministically) hangs during the extraction. I suspect that the multithreaded application is deadlocking over some shared resource. The developer, however, suspects the problem lies elsewhere and is not addressing it at this time. I would like to do some legwork on my end to a) prove the condition exists and b) ideally point him in the right direction. The problem: while I used to be a programmer, it's been a while and I need to shake off the dust (the last work I did was back in '99, and it was under Solaris, while this application runs under XP). Rather than a dearth of information online, there's almost too much to digest. Are there any suggested guides or tutorials that might help me get back up to speed enough to identify and/or diagnose the deadlock, or are there tools or approaches that I should study up on to aid me in my task? Many thanks for all suggestions!
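
    As a starting point (a hedged suggestion, not from the original thread): a user-mode dump of the hung process will show whether two threads are each blocked on a lock the other holds. Sysinternals ProcDump plus WinDbg is a common combination on XP:

        :: Hedged sketch: capture a full dump of the hung process (replace <pid>),
        :: then inspect it in WinDbg.
        procdump -ma <pid> hung.dmp
        :: In WinDbg, open hung.dmp and run:
        ::   ~* kb     - call stacks for every thread
        ::   !locks    - critical sections currently held, with owning threads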

    Read the article

  • SQL Server Editions and Integration Services

    The SQL Server 2005 and SQL Server 2008 product family has quite a few editions now, so what does this mean for SQL Server Integration Services? Starting from the bottom, we have the free edition known as Express, the entry-level Workgroup edition, and the new Web edition. None of these three include the full SSIS product, but they do all include the SQL Server Import and Export Wizard, with access to basic data sources but nothing more, so for simple loading and extraction of data this should suffice. You will not be able to build packages, though; this is just a one-shot deal aimed at using the wizard on an ad-hoc basis.

    To get the full power of Integration Services you need to start with Standard edition. This includes the BI Development Studio for building your own packages, a fully functional IDE integrated into Visual Studio (you get the full VS 2005/2008 IDE with the product). All core functions will be available, but with a restricted set of transformations and tasks. The SQL Server 2005 Features Comparison and Features Supported by the Editions of SQL Server 2008 describe Standard edition as having basic transforms, compared to Enterprise which includes the advanced transforms. I think "basic" is a little harsh considering the power you get with Standard, but "advanced" covers the truly ground-breaking capabilities of data mining, text mining and cleansing or fuzzy transforms. The power of performing these operations within your ETL pipeline should not be underestimated, but not all processes will require these capabilities, so it seems like a reasonable delineation.

    Thankfully there are no feature limitations or artificial governors within Standard compared to Enterprise. The same control flow and data flow engines underpin both editions, with the same configuration and deployment options, allowing you to work seamlessly between environments and editions if using the common components. In fact there are no governors at all in SSIS, so whilst the SQL database engine is limited to 4 CPUs in Standard edition, SSIS is limited only by the base operating system.

    The advanced transforms only available with Enterprise edition: Data Mining Training Destination, Data Mining Query Component, Fuzzy Grouping, Fuzzy Lookup, Term Extraction, Term Lookup, Dimension Processing Destination, and Partition Processing Destination. The advanced task only available with Enterprise edition: Data Mining Query Task.

    So in summary, if you want SQL Server Integration Services, you need SQL Server Standard edition, and for the more advanced tasks and transforms you need SQL Server Enterprise edition. To recap, the answer to the often-asked question is no, SQL Server Integration Services is not available in the SQL Server Express or Workgroup editions.

    Read the article

  • Microsoft Semantic Search

    - by sqlartist
    This is something I really get excited about: Microsoft Semantic Search. There is an excellent PDC demo and presentation here: http://microsoftpdc.com/Sessions/SVR32. Initially I didn't think this was SQL-related, but I read that it may be included in future versions of SQL Server. For many years I have written linguistic, semantic, text extraction & clustering code in SQL Server for fun; now finally I can throw all that away and use this tool :) It reminds me of the Microsoft Research...(read more)

    Read the article

  • Qt Creator 2.5 is out in beta; the IDE supports more C++11 features

    Following the beta release of Qt Creator 2.5, it is high time to tour a few of the new features, without reviewing them all. C++11: Published last September, the ISO C++11 standard deserves better support in the IDE; notably, the keywords nullptr, constexpr, static_assert, noexcept and auto are now recognized, as well as inline namespaces and (partially) lambdas. Likewise, a few new refactoring actions are available: inserting an #include for undefined identifiers, function extraction, parameter-list reordering, synchroniza...
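
    For reference, a small snippet (an illustration of the standard constructs named above, not taken from the article) exercising the keywords the IDE now recognizes:

        // Hedged illustration: the C++11 constructs listed above in one snippet.
        #include <vector>
        #include <algorithm>

        constexpr int answer = 42;                 // constexpr
        static_assert(answer == 42, "sanity");     // static_assert

        int main() {
            int* p = nullptr;                      // nullptr
            std::vector<int> v = {1, 2, 3};
            auto it = std::find(v.begin(), v.end(), 2);              // auto
            auto isEven = [](int n) noexcept { return n % 2 == 0; }; // lambda + noexcept
            return (p == nullptr && it != v.end() && isEven(*it)) ? 0 : 1;
        }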

    Read the article

  • The Latest Developments with Oracle's Report Tool, XML Publisher

    Rich Colton, Application Integration Manager for Washington Group International (WGI), and Tim Dexter, XML Publisher Group Product Manager, speak with Cliff about the Enterprise release of XML Publisher, the new extraction engine that allows developers to create reports that access multiple databases and data sources, and WGI's XML strategy and its benefits for their business applications.

    Read the article

  • SQL Server 2008 and 2008 R2 Integration Services - Managing Local Processes Using Script Task

    SQL Server 2008 R2 Integration Services includes a number of predefined tasks that implement common administrative actions to help with data extraction, transformation and loading (ETL). While in a majority of cases they are sufficient to deliver the required functionality, there might be situations where an extra level of flexibility is desired.
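
    To give a flavor of that flexibility (a hedged sketch of the general technique, not the article's own listing): a Script Task can manage a local process via System.Diagnostics; the variable name below is an assumption for illustration:

        // Hedged sketch of a Script Task body (C#): launch a local process,
        // capture its output, and hand it to the package via an SSIS variable.
        using System.Diagnostics;

        public void Main()
        {
            ProcessStartInfo psi = new ProcessStartInfo("cmd.exe", "/c dir C:\\staging");
            psi.UseShellExecute = false;
            psi.RedirectStandardOutput = true;

            using (Process p = Process.Start(psi))
            {
                string output = p.StandardOutput.ReadToEnd();
                p.WaitForExit();
                // assumed variable; list it in the task's ReadWriteVariables
                Dts.Variables["User::DirListing"].Value = output;
            }

            Dts.TaskResult = (int)ScriptResults.Success;
        }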

    Read the article

  • SQL Server 2012 Integration Services - Implementing Package Security using Access Control

    SQL Server 2012 Integration Services offers a wide range of powerful features that allow you to streamline and automate tasks involving data extraction, transformation, and loading. However, incorporating these features into your existing business intelligence framework frequently necessitates additional security measures to ensure that the data being processed remains protected from unauthorized access.

    Read the article

  • Fix Linux-made png file for use on Windows

    - by BGM
    There is a particular icon library that I really like. Now, I have downloaded the package that has the PNG files inside (I know the ICO files are there too, but I want the PNG files). However, my Windows 7 computer tells me that about 1/3 of the PNG files are corrupt. I usually use XnView to view the files, and it won't display the "corrupt" files; I've tried other editors and viewers and I get the same issue. Now, the PNG package was originally designed for Linux as an OS icon package for the entire system, so I figure the PNG files were built on Linux. So, is there a way I can "fix" the "corrupted" PNG files for my Windows 7 computer? Maybe when the files were created there was some bit that was off-colour or something? Any clues? [edit] I have read in this thread that the "corruption" could happen during the extraction process. I did all the extraction with 7-Zip; it was a zip containing a tar. I will try another extractor, but I don't think it will make any difference.
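
    Two command-line tools (a hedged suggestion, not from the original post) can diagnose and often repair such files: pngcheck reports chunk and CRC errors, and pngcrush can rewrite a file while fixing bad CRCs:

        :: Hedged sketch: diagnose, then rewrite with CRC fix-up.
        pngcheck -v broken.png
        pngcrush -fix broken.png fixed.png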

    Read the article

  • JDBC Triggers

    - by Tim Dexter
    Received a question from a customer last week; they were using the new rollup patch on top of 10.1.3.4.1. What are these boxes for? Don't you know? Surely? Well, they are for... that new functionality, you know, it's in the user docs, that thingmabobby doodah. OK, I don't know either. I can have a guess, but let me check first. Several IM sessions, emails and a dig through the readme for the new patch, and I had my answer. It's not in the official documentation yet; Leslie is on the case.

    The two fields were designed to allow an admin to set a user's context attributes before a connection is made to a database, and to un-set the attributes after the connection is broken by the extraction engine. We got a sample from the Enterprise Manager team on how they will be using it with their VPD connections: FUNCTION bip_to_em_user (user_name_in IN VARCHAR2) RETURN BOOLEAN IS BEGIN SETEMUSERCONTEXT(user_name_in, MGMT_USER.OP_SET_IDENTIFIER); return TRUE; END bip_to_em_user; It is used in the JDBC data source definition like this (pre-process function): sysman.mgmt_bip.bip_to_em_user(:xdo_user_name) You, of course, can call any function that is going to return a boolean value; another example might be: FUNCTION set_per_process_username (username_in IN VARCHAR2) RETURN BOOLEAN IS BEGIN SETUSERCONTEXT(username_in); return TRUE; END set_per_process_username; Just use your own function/package to set some user context.

    Very grateful for the mail from Leslie on the EM team's usage, but I had to try it out. Rather than set up a VPD, I opted for a simpler test: can I log the comings and goings of users and their queries using the same pre-process text box? Reaching back into the depths of my developer brain to remember some PL/SQL (it was not that deep), I came up with: CREATE OR REPLACE FUNCTION BIPTEST (user_name_in IN VARCHAR2, smode IN VARCHAR2) RETURN BOOLEAN AS BEGIN INSERT INTO LOGTAB VALUES(user_name_in, sysdate, smode); RETURN true; END BIPTEST; To call it in the pre-fetch trigger: BIPTEST(:xdo_user_name) Not going to set the PL/SQL world alight, I know, but you get the idea. As a new connection is made to the database it is logged in the LOGTAB table. The SMODE value just records whether it is an entry or an exit; I used both the pre- and post- boxes.

    NAME           UPDATE_DATE                         S_FLAG
    oracle         14-MAY-10 09.51.34.000000000 AM     Start
    oracle         14-MAY-10 10.23.57.000000000 AM     Finish
    administrator  14-MAY-10 09.51.38.000000000 AM     Start
    administrator  14-MAY-10 09.51.38.000000000 AM     Finish
    oracle         14-MAY-10 09.51.42.000000000 AM     Start
    oracle         14-MAY-10 09.51.42.000000000 AM     Finish

    It works very well. I had some fun trying to find a nasty query for the extraction engine so that the timestamps from in to out actually had a difference; that engine is fast! The only derived value you can pass from BIP is :xdo_user_name; none of the other server values are available. Connection pools are not currently supported but are planned for a future release. Now you know what those fields are for; look for some official documentation, rather than my ramblings, coming soon!

    Read the article

  • Fill a Flash Drive with Portable Software using Lupo PenSuite

    - by Asian Angel
    A flash drive full of portable software is helpful to have along wherever you go. The Lupo PenSuite lets you choose from three different versions to get the best fit for your everyday needs. Note: if running the full version you will need a 512 MB USB flash drive or larger.

    Using Lupo PenSuite

    The one window to watch for during the setup process is where you have the opportunity to add a specific language pack if needed. Outside of that, all you need to do is sit back and wait for the suite to be extracted. Note: extraction times will vary based on version and extraction location. Here we browsed to our flash drive to extract it to… Once the setup process is complete, locate and double-click the Lupo_PenSuite.exe file. This one-time window presents the opportunity to start using the suite immediately or to go directly into the options. When the suite is active you will have a new system tray icon that operates as a start menu button. At the bottom you can monitor the remaining room on your flash drive, and use the close button to exit the suite (it may display as a power button, depending on the menu theme). A quick look at the setup inside the suite: there is a pre-configured area for organizing and storing your personal files. Prefer a classic-style menu? Just select it in the options (Various tab) and enjoy a smaller, streamlined look. Note: you can also change the theme for the regular menu and add a user pic. The suite provides access to your portable software and online sites, so you get to enjoy the best of both, as shown in the following examples. Websites will open using the suite's portable Firefox install. VLC is ready to play your downloaded videos. The suite also has some very nice photo editing programs added in.

    Installing Additional Apps

    If one of your favorite programs is not included in the suite version, it only takes a few minutes to add it in. Go to the Additional Apps webpage, download the app(s), and extract them onto your hard drive (the link for the additional apps webpage is provided below). Add the extracted app(s) to the MyApps folder in the suite's folder hierarchy. Click on ASuite in the suite's start menu. Drag and drop the portable app's exe file into the MyApps section in the ASuite window. Your new software's shortcut should display as shown here. Close this window when finished. Checking the suite's start menu will show your new software ready to be used.

    Conclusion

    If you need a good portable software collection to carry with you on a flash drive, then Lupo PenSuite is definitely worth taking a look at. We tested Lupo PenSuite on XP, Vista, and Windows 7 and it works great on all three. Another popular choice is PortableApps, and you can check out our review of that too; they are essentially the same thing, each is just packaged differently.

    Links

    Download Lupo PenSuite (Full, Lite, & Zero versions) *Download links approximately one-third down the page.
    Download Additional Apps for Lupo PenSuite
    Download Additional Skins for Lupo PenSuite Start Menu
    View Video Tutorials *Has tutorial for easy updating of entire suite.

    Read the article

  • Best Design Pattern for Coupling User Interface Components and Data Structures

    - by szahn
    I have a Windows desktop application with a tree view. Due to the lack of a sound data-binding solution for a tree view, I've implemented my own layer of abstraction on it to bind nodes to my own data structure. The requirements are as follows: Populate a tree view with nodes that resemble fields in a data structure. When a node is clicked, display the appropriate control to modify the value of that property in the instance of the data structure. The tree view is populated with instances of custom TreeNode classes that inherit from TreeNode. The responsibility of each custom TreeNode class is to (1) format the node text to represent the name and value of the associated field in my data structure, (2) return the control used to modify the property value, (3) get the value of the field in the control, and (4) set the field's value from the control. My custom TreeNode implementation has a property called "Control" which retrieves the proper custom control in the form of the base control. The control instance is stored in the custom node and instantiated upon first retrieval. So each custom node has an associated custom control which extends a base abstract control class. Example TreeNode implementation: //The Tree Node Base Class public abstract class TreeViewNodeBase : TreeNode { public abstract CustomControlBase Control { get; } public TreeViewNodeBase(ExtractionField field) { UpdateControl(field); } public virtual void UpdateControl(ExtractionField field) { Control.UpdateControl(field); UpdateCaption(FormatValueForCaption()); } public virtual void SaveChanges(ExtractionField field) { Control.SaveChanges(field); UpdateCaption(FormatValueForCaption()); } public virtual string FormatValueForCaption() { return Control.FormatValueForCaption(); } public virtual void UpdateCaption(string newValue) { this.Text = Caption; this.LongText = newValue; } } //The tree node implementation class public class ExtractionTypeNode : TreeViewNodeBase { private CustomDropDownControl control; public override CustomControlBase Control { get { if (control == null) { control = new CustomDropDownControl(); control.label1.Text = Caption; control.comboBox1.Items.Clear(); control.comboBox1.Items.AddRange( Enum.GetNames( typeof(ExtractionField.ExtractionType))); } return control; } } public ExtractionTypeNode(ExtractionField field) : base(field) { } } //The custom control base class public abstract class CustomControlBase : UserControl { public abstract void UpdateControl(ExtractionField field); public abstract void SaveChanges(ExtractionField field); public abstract string FormatValueForCaption(); } //The custom control generic implementation (view) public partial class CustomDropDownControl : CustomControlBase { public CustomDropDownControl() { InitializeComponent(); } public override void UpdateControl(ExtractionField field) { //Nothing to do here } public override void SaveChanges(ExtractionField field) { //Nothing to do here } public override string FormatValueForCaption() { //Nothing to do here return string.Empty; } } //The custom control specific implementation public class FieldExtractionTypeControl : CustomDropDownControl { public override void UpdateControl(ExtractionField field) { comboBox1.SelectedIndex = comboBox1.FindStringExact(field.Extraction.ToString()); } public override void SaveChanges(ExtractionField field) { field.Extraction = (ExtractionField.ExtractionType) Enum.Parse(typeof(ExtractionField.ExtractionType), comboBox1.SelectedItem.ToString()); } public override string FormatValueForCaption() { return
string.Empty; } The problem is that I have "generic" controls which inherit from CustomControlBase. These are just "views" with no logic. Then I have specific controls that inherit from the generic controls. I don't have any functions or business logic in the generic controls because the specific controls should govern how data is associated with the data structure. What is the best design pattern for this?

    Read the article
