Search Results

Search found 4836 results on 194 pages for 'execution plans'.


  • DCOGS Balance Breakup Diagnostic in OPM Financials

    - by ChristineS-Oracle
    The purpose of this diagnostic (OPMDCOGSDiag.sql) is to identify the sales orders that constitute the Deferred COGS account balance. It gathers detailed transaction information for the sales order(s) across Order Management, Accounts Receivables, Inventory and the OPM Financials subledger at the organization level. The script covers the common scenarios of standard sales orders and return orders (RMA), coupled with all the applicable OPM costing methods such as Standard, Actual and Lot costing. Objective: collect, in a single spreadsheet, the information for sales order(s) at different stages of their life cycle. This helps by reducing the time needed for data collection, speeding up diagnosis of the issue, and easing collaboration across modules such as Order Management, Accounts Receivables, Inventory and Cost Management. You can download the script from Doc ID 1617599.1, DCOGS Balance Breakup (SO/RMA) and Diagnostic Analyzer in OPM Financials.

    Read the article

  • Webcast: Introduction To Causal Factors

    - by ChristineS-Oracle
    Webcast: Introduction To Causal Factors. Date: June 11, 2014 at 11:00 am ET, 10:00 am CT, 9:00 am MT, 8:00 am PT, 8:30 pm India Time (Mumbai, GMT+05:30). This one-hour advisor webcast will provide an introduction to causal factors for Demand Management and AFDM. Pre-seeded causal factors will be discussed, as well as when they are not appropriate. Scenarios for adding causal factors will be covered, along with best-practice methods for adding and using them. Topics will include: causal factors in DM and AFDM; pre-seeded causal factors; when to modify causal factor settings; best practices when working with causal factors. Details & Registration: Doc ID 1664606.1

    Read the article

  • Mouse Clicks, Reactive Extensions and StreamInsight Mashup

    I had an hour spare this afternoon so I wanted to have another play with Reactive Extensions in .NET and StreamInsight. I also didn't want to simply use a console window as a way of gathering events, so I decided to use a Windows Form instead. The task I set myself was this: whenever I click on my form, I want to subscribe to the event and output its location and the timestamp of the event to the console window. In addition, for every mouse click I do, I want to know how many mouse clicks have happened in the last 5 seconds.

    The second point here is really interesting. I have often found this when working with people on problems: it is how you ask the question that determines how you tackle the problem. I will show two possible ways of answering the second question, depending on how it is interpreted. As a side effect of this example I will show how time in StreamInsight can stand still. This is an important concept and we can see it in the output later.

    Now to the code. I will break it all down in this blog post, but you can download the solution and see it all together. I created a console application and then instantiated a Windows Form:

        frm = new Form();
        Thread g = new Thread(CallUI);
        g.SetApartmentState(ApartmentState.STA);
        g.Start();

    CallUI looks like this:

        static void CallUI()
        {
            System.Windows.Forms.Application.Run(frm);
            frm.Activate();
            frm.BringToFront();
        }

    Now what we need to do is create an observable from the MouseClick event on the form. For this we use the Reactive Extensions:

        var lblevt = Observable.FromEvent<MouseEventArgs>(frm, "MouseClick").Timestamp();

    As mentioned earlier I have two objectives in this example, and to solve the first I am again going to use the Reactive Extensions. Let's subscribe to the MouseClick event and output the location and timestamp to the console:

        lblevt.Subscribe(evt =>
        {
            Console.WriteLine("Clicked: {0}, {1} ", evt.Value.EventArgs.Location, evt.Timestamp);
        });

    That should take care of objective #1, but what about the second objective? For that we need some temporal windowing, and this means StreamInsight. First we need to turn our observable collection of MouseClick events into a point stream:

        Server s = Server.Create("Default");
        Microsoft.ComplexEventProcessing.Application a = s.CreateApplication("MouseClicks");
        var input = lblevt.ToPointStream(
            a,
            evt => PointEvent.CreateInsert(
                evt.Timestamp,
                new
                {
                    loc = evt.Value.EventArgs.Location.ToString(),
                    ts = evt.Timestamp.ToLocalTime().ToString()
                }),
            AdvanceTimeSettings.IncreasingStartTime);

    Now that we have created our point stream we need to do something with it, and this is where we get to our second objective. It is pretty clear that we want some kind of windowing, but what? Here is one way of doing it. It might not be what you wanted, but again it depends on how the second objective is interpreted:

        var q = from i in input.TumblingWindow(
                    TimeSpan.FromSeconds(5),
                    HoppingWindowOutputPolicy.ClipToWindowEnd)
                select new { CountOfClicks = i.Count() };

    The above code creates tumbling windows of 5 seconds and counts the number of events in each window. If there are no events in the window then no result is output. Likewise, until an event (MouseClick) is issued we do not see anything in the output (that is not strictly true, because it is the CTI strapped to our MouseClick events that flushes the events through the StreamInsight engine, not the events themselves). This approach is centred around the windows and not the events. Until the windows complete and a CTI is issued, no events are pushed through.

    An alternate way of answering our second question is below:

        var q = from i in input.AlterEventDuration(evt => TimeSpan.FromSeconds(5))
                               .SnapshotWindow(SnapshotWindowOutputPolicy.Clip)
                select new { CountOfClicks = i.Count() };

    In this code we extend the duration of each MouseClick to five seconds. We then create snapshot windows over those events. Snapshot windows are discussed in detail here. With this solution we are centred around the events; it is the events that are driving the output. Let's have a look at the output from this solution, as it may be a little confusing. First, though, let me show how we get the output from StreamInsight into the console window:

        foreach (var x in q.ToPointEnumerable().Where(e => e.EventKind != EventKind.Cti))
        {
            Console.WriteLine(x.Payload.CountOfClicks);
        }

    OK, so now to the output. The original post shows a timeline table here: clicks were issued at 22:40:16, 22:40:18, 22:40:20, 22:40:22, 22:40:24 and 22:40:32; each click was extended to a five-second event; and the snapshot-window counts emitted along the way were 1, 2, 3, 2, 3, 2, 3, 2 and 1. One thing that helps in reading it is that for our point stream we set the issuing of CTIs to IncreasingStartTime. What this means is that the CTI is placed right at the start of the event, so it will not flush the event with which it was issued but will flush those prior to it.

    What we can see in the output is that the counts include all the end edges that have occurred between the mouse clicks. If we look specifically at the mouse click at 22:40:32, we see that three events are returned to us: the end-edge counts at 22:40:25, 22:40:27 and 22:40:29. Another thing we notice is that until we actually issue a CTI at 22:40:32, those last three snapshot-window counts will never be reported. Hopefully this has helped to explain a few concepts around StreamInsight and the IObservable pattern. You can download this solution from here and play. You will need the Reactive Framework from here and StreamInsight 1.1.
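    For comparison, here is a rough Rx-only way to keep a rolling count of clicks over the last five seconds, without StreamInsight at all. This is a sketch, not part of the original solution: it assumes the System.Reactive package and the Form named frm from the post, it uses the current Observable.FromEventPattern and time-based Buffer operators rather than the 2010-era Rx API shown above, and the one-second hop is an arbitrary choice.

        // Sketch only (not from the original post): rolling click count with plain Rx.
        using System;
        using System.Reactive.Linq;
        using System.Windows.Forms;

        static class RollingClickCount
        {
            public static IDisposable Attach(Form frm)
            {
                return Observable
                    .FromEventPattern<MouseEventArgs>(frm, "MouseClick")
                    .Buffer(TimeSpan.FromSeconds(5), TimeSpan.FromSeconds(1)) // overlapping 5s windows, hopping every 1s
                    .Subscribe(window => Console.WriteLine("Clicks in the last 5 seconds: {0}", window.Count));
            }
        }

    Unlike the event-driven snapshot-window query, this version emits on a timer (every hop) whether or not a click has just happened, so it answers a slightly different reading of the same question.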

    Read the article

  • Webcast: Flow Manufacturing Work Order-less Completion

    - by ChristineS-Oracle
    Webcast: Flow Manufacturing Work Order-less Completion. Date: August 27, 2014 at 11:00 am ET, 10:00 am CT, 9:00 am MT, 8:00 am PT, 8:30 pm India Time (Mumbai, GMT+05:30). This advisor webcast is intended for technical and functional users who want to understand Flow Manufacturing Work Order-less Completion and its transaction types. The presentation will include the required setups and provide details of the business process. Topics will include: overview of Flow Manufacturing and its integration with Work In Process; overview of Work Order-less Completion; basic setup steps; transactions performed in the Work Order-less Completion form. Details & Registration: Doc ID 1906749.1

    Read the article

  • Create MSDB Folders Through Code

    You can create package folders through SSMS, but you may also wish to do this as part of a deployment process or installation. In this case you will want a programmatic method for managing folders, so how can this be done? The short answer is: go and look at the table msdb.dbo.sysdtspackagefolders90. This is where folder information is stored, using a simple parent-and-child hierarchy. To add a new folder directly we just insert into the table:

        INSERT INTO dbo.sysdtspackagefolders90
        (
            folderid,
            parentfolderid,
            foldername
        )
        VALUES
        (
            NEWID(),                 -- New GUID for our new folder
            <<Parent Folder GUID>>,  -- Parent folder GUID if a child of another folder, or the root GUID 00000000-0000-0000-0000-000000000000
            <<Folder Name>>          -- New folder name
        );

    There are also some stored procedures:

        sp_dts_addfolder
        sp_dts_deletefolder
        sp_dts_getfolder
        sp_dts_listfolders
        sp_dts_renamefolder

    To add a new folder to the root we could call the sp_dts_addfolder stored procedure:

        EXEC msdb.dbo.sp_dts_addfolder
            @parentfolderid = '00000000-0000-0000-0000-000000000000', -- Root GUID
            @name = 'New Folder Name';

    The stored procedures wrap very simple SQL statements, but provide a level of security, as they check the role membership of the user and do not require permissions to perform direct table modifications.
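    If the folder needs to be created from application code rather than a SQL script, the same stored procedure can be called through ADO.NET. The following is only a sketch: the connection string is an assumption, and Guid.Empty supplies the all-zero root folder GUID mentioned above.

        // Sketch: call msdb.dbo.sp_dts_addfolder from C#; connection string is an assumption.
        using System;
        using System.Data;
        using System.Data.SqlClient;

        class AddMsdbFolder
        {
            static void Main()
            {
                using (var conn = new SqlConnection(@"Data Source=.;Initial Catalog=msdb;Integrated Security=True"))
                using (var cmd = new SqlCommand("dbo.sp_dts_addfolder", conn))
                {
                    cmd.CommandType = CommandType.StoredProcedure;
                    cmd.Parameters.AddWithValue("@parentfolderid", Guid.Empty); // root folder GUID (all zeros)
                    cmd.Parameters.AddWithValue("@name", "New Folder Name");
                    conn.Open();
                    cmd.ExecuteNonQuery();
                }
            }
        }

    Because the procedure checks role membership itself, this works without granting the caller direct modify rights on sysdtspackagefolders90.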

    Read the article

  • Enterprise Trade Compliance: Changing Trade Operations around the World

    - by John Murphy
    We live in a world of incredible bounty and speed where any product can be delivered anywhere on earth. However, our world is also filled with challenges for business – where volatility, uncertainty, risk, and chaos are our daily companions. To prosper amid the realities of this new world, organizations cannot rely on old strategies; they need new business models. Key trends within the global economy are mandating that companies fully integrate global trade management best practices within broader supply chain management strategies, rather than simply leaving it as a discrete event at the end of the order or procurement cycle. To explain, many companies face a complicated and changing compliance environment. This is directly linked to the speed and configuration of the supply chain, particularly with the explosion of new markets, shorter service cycles and ship times, accelerating rates of globalization and outsourcing, and increasing product complexity and regulation. Read More...

    Read the article

  • TAKE Solutions Implements Oracle Mobile Supply Chain Applications for Leading Housewares Manufacturer

    - by John Murphy
    TAKE Solutions Ltd. [BSE: 532890 | NSE: TAKE], a leader in the Supply Chain Management and Life Sciences domains, today announced the successful implementation of Oracle Mobile Supply Chain Applications (MSCA®) for a leading manufacturer of household goods. Leveraging TAKE’s more than 15 years of expertise with the Oracle® E-business Suite products, the customer has achieved real-time inventory visibility into manufacturing, put-away and customer shipments. TAKE also implemented location control and cycle counting to provide additional visibility and inventory accuracy. http://www.virtual-strategy.com/2012/06/05/take-solutions-implements-oracle-mobile-supply-chain-applications-leading-housewares-manu

    Read the article

  • Oracle Transportation User Conference Agenda Released

    - by John Murphy
    The Oracle Transportation Management (OTM) User Conference agenda is now available.   The event brings together users, implementers and prospective customers of OTM.   The event is held annually in Philadelphia with this year's event taking place August 12 - 15.   Follow one of the links to see the complete agenda and to register to attend.  http://otmconference.com/agenda.aspx

    Read the article

  • Process Manufacturing (OPM) Actual Costing Analyzer Diagnostic Script

    - by ChristineS-Oracle
    The OPM Actual Costing Analyzer is a script which you can use proactively at any time to review setups and pieces of data which are known to affect either the performance or the accuracy of the OPM Actual Cost process or of Lot Costing. Each topic reviewed by this report has been specifically selected because it points to the solution used to resolve at least two Service Requests during a recent three-month period. You can download this script from Doc ID 1629384.1, OPM Actual Costing Analyzer Diagnostic Script.

    Read the article

  • Troubleshooting .NET "Fatal Execution Engine Error"

    - by JYelton
    Summary: I periodically get a .NET Fatal Execution Engine Error in an application which I cannot seem to debug. The dialog that comes up only offers to close the program or send information about the error to Microsoft. I've tried looking at the more detailed information but I don't know how to make use of it.

    Error: The error is visible in Event Viewer under Applications and is as follows: .NET Runtime version 2.0.50727.3607 - Fatal Execution Engine Error (7A09795E) (80131506). The computer running it is Windows XP Professional SP3 (Intel Core2 Quad Q6600 2.4 GHz with 2.0 GB of RAM). Other .NET-based projects that lack multi-threaded downloading (see below) seem to run just fine.

    Application: The application is written in C#/.NET 3.5 using VS2008 and installed via a setup project. The app is multi-threaded and downloads data from multiple web servers using System.Net.HttpWebRequest and its methods. I've determined that the .NET error has something to do with either threading or HttpWebRequest, but I haven't been able to get any closer, as this particular error seems impossible to debug. I've tried handling errors on many levels, including the following in Program.cs:

        // handle UI thread exceptions
        Application.ThreadException += new ThreadExceptionEventHandler(Application_ThreadException);
        // handle non-UI thread exceptions
        AppDomain.CurrentDomain.UnhandledException += new UnhandledExceptionEventHandler(CurrentDomain_UnhandledException);
        Application.EnableVisualStyles();
        Application.SetCompatibleTextRenderingDefault(false);
        // force all windows forms errors to go through our handler
        Application.SetUnhandledExceptionMode(UnhandledExceptionMode.CatchException);

    More notes and what I've tried:
    - Installed Visual Studio 2008 on the target machine and tried running in debug mode, but the error still occurs, with no hint as to where in source code it occurred.
    - When running the program from its installed version (Release) the error occurs more frequently, usually within minutes of launching the application.
    - When running the program in debug mode inside of VS2008, it can run for hours or days before generating the error.
    - Reinstalled .NET 3.5 and made sure all updates are applied.
    - Broke random cubicle objects in frustration.
    - Rewritten parts of the code that deal with threading and downloading in attempts to catch and log exceptions, though logging seemed to aggravate the problem (and never provided any data).

    Question: What steps can I take to troubleshoot or debug this kind of error? Memory dumps and the like seem to be the next step, but I'm not experienced at interpreting them. Perhaps there's something more I can do in the code to try and catch errors. It would be nice if the "Fatal Execution Engine Error" were more informative, but internet searches have only told me that it's a common error for a lot of .NET-related items.
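    For completeness, here is a minimal sketch of what the two handler bodies referenced in the wiring above might look like; the log path, class name and method placement are assumptions, not the asker's actual code.

        // Sketch only: handler bodies for the wiring shown above; log path is an assumption.
        using System;
        using System.IO;
        using System.Threading;

        static class CrashLogging
        {
            static readonly string LogPath = @"C:\temp\app-errors.log";

            // Non-UI thread exceptions (AppDomain.CurrentDomain.UnhandledException)
            internal static void CurrentDomain_UnhandledException(object sender, UnhandledExceptionEventArgs e)
            {
                File.AppendAllText(LogPath, DateTime.Now + " [AppDomain] " + e.ExceptionObject + Environment.NewLine);
            }

            // UI thread exceptions (Application.ThreadException)
            internal static void Application_ThreadException(object sender, ThreadExceptionEventArgs e)
            {
                File.AppendAllText(LogPath, DateTime.Now + " [UI] " + e.Exception + Environment.NewLine);
            }
        }

    If nothing ever appears in such a log, that in itself suggests the failure happens below the level these managed handlers can observe.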

    Read the article

  • Sandboxed Javascript Execution in an Internet Explorer Extension (BHO)

    - by TelegramSam
    Firefox has the Sandbox and evalInSandbox(). Chrome has sandboxed execution in their content scripts (they call it isolated execution). I'm looking for the same thing in an IE browser extension. I can load a javascript file, then call evalScript(), but the code executes in the same environment as the javascript that exists on the page. I need a way to run my library (which includes and is based on jQuery) in a sandboxed/isolated environment, but still allow it to modify the DOM as if it were running on the page. Jint looks promising, but cannot currently evaluate jQuery. (They can parse it.) How can I do this?

    Read the article

  • Spring Webflow: cannot get flow execution url at action phase of portlet

    - by tabdulin
    The following exception is thrown:

        Caused by: java.lang.IllegalStateException: A flow execution action URL can only be obtained in a RenderRequest using a RenderResponse
            at org.springframework.webflow.context.portlet.PortletExternalContext.getFlowExecutionUrl(PortletExternalContext.java:206)
            at org.springframework.webflow.engine.impl.RequestControlContextImpl.getFlowExecutionUrl(RequestControlContextImpl.java:178)
            at org.springframework.webflow.mvc.view.AbstractMvcView.render(AbstractMvcView.java:172)
            at org.springframework.webflow.engine.ViewState.render(ViewState.java:282)
            at org.springframework.webflow.engine.ViewState.refresh(ViewState.java:241)
            at org.springframework.webflow.engine.ViewState.resume(ViewState.java:219)
            at org.springframework.webflow.engine.Flow.resume(Flow.java:545)
            at org.springframework.webflow.engine.impl.FlowExecutionImpl.resume(FlowExecutionImpl.java:259)
            ... 62 more

    It seems to me like resuming execution of the flow at the action phase tries to do the render phase's work. Any ideas?

    Read the article

  • SWFUpload is it possible to upload multiple files to a single php script execution

    - by user176333
    Hello, I'm trying to implement SWFUpload into existing PHP upload functionality. My current backend script, however, expects two files to be uploaded in a single PHP script execution (i.e. it expects the $_FILES parameter to contain two entries). So I'm queueing two files with SWFUpload and start uploading them. However, it appears SWFUpload calls the PHP backend script once for each queued file. I'd rather modify SWFUpload to send the files in a single backend script execution instead of having to adjust the backend script. Is anyone familiar with this? I've searched various resources (like the SWFUpload docs and forum) but have not found similar topics. Thanks in advance

    Read the article

  • Line of code doesn't follow sequential execution

    - by ryudice
    Hi, I'm having a problem with code that doesn't follow sequential execution, although I'm not using threading. My code calls one function, and when I'm debugging inside the function, it returns to the line of code following the function call although the function hasn't finished executing. I have no idea why this would happen; any ideas? Thanks in advance.

        workflow.SaveControlTiempo(solEntity, traId, Usuario.GetUsrId()); // this is my function
        // code execution continues on the next line even if the function hasn't finished,
        // and since the function hasn't finished I get an exception
        RadAjaxManager.GetCurrent(Page).RadAlert("Solicitud Transicionada con \u00c9xito");
        var javascripFunction = "CloseWindow('Solicitud <b>{0}</b><br />Transicionada con \u00c9xito.<li> <b>Etapa Destino: </b>{1}<li><b>Usuario: </b>{2}');";
        javascripFunction = string.Format(javascripFunction, solEntity.SOL_CODIGO, solEntity.WKF_ETP_ETAPAS.ETP_DES, DNNUtil.GetInstance().GetUserName(solEntity.USR_ID));

    Read the article

  • How to repeat a particular execution multiple times

    - by Joshua
    The following snippet generates create/drop SQL for a particular database whenever there is a modification to the JPA entity classes. How do I perform something equivalent to a 'for' operation, wherein the following code can be used to generate SQL for all supported databases (e.g. H2, MySQL, Postgres)? Currently I have to modify db.groupId, db.artifactId and db.driver.version every time to generate the SQL files.

        <plugin>
          <groupId>org.codehaus.mojo</groupId>
          <artifactId>hibernate3-maven-plugin</artifactId>
          <version>${hibernate3-maven-plugin.version}</version>
          <executions>
            <execution>
              <id>create schema</id>
              <phase>process-test-resources</phase>
              <goals>
                <goal>hbm2ddl</goal>
              </goals>
              <configuration>
                <componentProperties>
                  <persistenceunit>${app.module}</persistenceunit>
                  <drop>false</drop>
                  <create>true</create>
                  <outputfilename>${app.sql}-create.sql</outputfilename>
                </componentProperties>
              </configuration>
            </execution>
            <execution>
              <id>drop schema</id>
              <phase>process-test-resources</phase>
              <goals>
                <goal>hbm2ddl</goal>
              </goals>
              <configuration>
                <componentProperties>
                  <persistenceunit>${app.module}</persistenceunit>
                  <drop>true</drop>
                  <create>false</create>
                  <outputfilename>${app.sql}-drop.sql</outputfilename>
                </componentProperties>
              </configuration>
            </execution>
          </executions>
          <dependencies>
            <dependency>
              <groupId>org.hibernate</groupId>
              <artifactId>hibernate-core</artifactId>
              <version>${hibernate-core.version}</version>
            </dependency>
            <dependency>
              <groupId>org.slf4j</groupId>
              <artifactId>slf4j-api</artifactId>
              <version>${slf4j-api.version}</version>
            </dependency>
            <dependency>
              <groupId>org.slf4j</groupId>
              <artifactId>slf4j-nop</artifactId>
              <version>${slf4j-nop.version}</version>
            </dependency>
            <dependency>
              <groupId>${db.groupId}</groupId>
              <artifactId>${db.artifactId}</artifactId>
              <version>${db.driver.version}</version>
            </dependency>
          </dependencies>
          <configuration>
            <components>
              <component>
                <name>hbm2cfgxml</name>
                <implementation>annotationconfiguration</implementation>
              </component>
              <component>
                <name>hbm2dao</name>
                <implementation>annotationconfiguration</implementation>
              </component>
              <component>
                <name>hbm2ddl</name>
                <implementation>jpaconfiguration</implementation>
                <outputDirectory>src/main/sql</outputDirectory>
              </component>
              <component>
                <name>hbm2doc</name>
                <implementation>annotationconfiguration</implementation>
              </component>
              <component>
                <name>hbm2hbmxml</name>
                <implementation>annotationconfiguration</implementation>
              </component>
              <component>
                <name>hbm2java</name>
                <implementation>annotationconfiguration</implementation>
              </component>
              <component>
                <name>hbm2template</name>
                <implementation>annotationconfiguration</implementation>
              </component>
            </components>
          </configuration>
        </plugin>

    Read the article

  • Problem measuring N times the execution time of a code block

    - by Nazgulled
    EDIT: I just found my problem after writing this long post explaining every little detail... If someone can give me a good answer on what I'm doing wrong and how I can get the execution time in seconds (using a float with 5 decimal places or so), I'll mark that as accepted. Hint: the problem was in how I interpreted the clock_gettime() man page.

    Hi, let's say I have a function named myOperation that I need to measure the execution time of. To measure it, I'm using clock_gettime() as was recommended here in one of the comments. My teacher recommends us to measure it N times so we can get an average, standard deviation and median for the final report. He also recommends us to execute myOperation M times instead of just once. If myOperation is a very fast operation, measuring it M times allows us to get a sense of the "real time" it takes, because the clock being used might not have the required precision to measure such an operation. So, executing myOperation one time or M times really depends on whether the operation itself takes long enough for the clock precision we are using.

    I'm having trouble dealing with that M-times execution. Increasing M decreases (a lot) the final average value, which doesn't make sense to me. Think of it like this: on average you take 3 to 5 seconds to travel from point A to point B. Then you go from A to B and back to A 5 times (which makes it 10 trips, because A to B is the same as B to A) and you measure that. Then you divide by 10, and the average you get is supposed to be the same average you take travelling from point A to B, which is 3 to 5 seconds. This is what I want my code to do, but it's not working. If I keep increasing the number of times I go from A to B and back to A, the average gets lower and lower each time; it makes no sense to me.

    Enough theory, here's my code:

        #include <stdio.h>
        #include <time.h>

        #define MEASUREMENTS 1
        #define OPERATIONS   1

        typedef struct timespec TimeClock;

        TimeClock diffTimeClock(TimeClock start, TimeClock end) {
            TimeClock aux;

            if((end.tv_nsec - start.tv_nsec) < 0) {
                aux.tv_sec = end.tv_sec - start.tv_sec - 1;
                aux.tv_nsec = 1E9 + end.tv_nsec - start.tv_nsec;
            } else {
                aux.tv_sec = end.tv_sec - start.tv_sec;
                aux.tv_nsec = end.tv_nsec - start.tv_nsec;
            }

            return aux;
        }

        int main(void) {
            TimeClock sTime, eTime, dTime;
            int i, j;

            for(i = 0; i < MEASUREMENTS; i++) {
                printf(" » MEASURE %02d\n", i+1);

                clock_gettime(CLOCK_REALTIME, &sTime);

                for(j = 0; j < OPERATIONS; j++) {
                    myOperation();
                }

                clock_gettime(CLOCK_REALTIME, &eTime);

                dTime = diffTimeClock(sTime, eTime);

                printf(" - NSEC (TOTAL): %ld\n", dTime.tv_nsec);
                printf(" - NSEC (OP): %ld\n\n", dTime.tv_nsec / OPERATIONS);
            }

            return 0;
        }

    Notes: the diffTimeClock function above is from this blog post. I replaced my real operation with myOperation() because it doesn't make any sense to post my real functions, as I would have to post long blocks of code; you can easily code a myOperation() with whatever you like if you wish to compile the code.

    As you can see, with OPERATIONS = 1 the results are:

        » MEASURE 01
         - NSEC (TOTAL): 27456580
         - NSEC (OP): 27456580

    For OPERATIONS = 100 the results are:

        » MEASURE 01
         - NSEC (TOTAL): 218929736
         - NSEC (OP): 2189297

    For OPERATIONS = 1000 the results are:

        » MEASURE 01
         - NSEC (TOTAL): 862834890
         - NSEC (OP): 862834

    For OPERATIONS = 10000 the results are:

        » MEASURE 01
         - NSEC (TOTAL): 574133641
         - NSEC (OP): 57413

    Now, I'm not a math wiz, far from it actually, but this doesn't make any sense to me whatsoever. I've already talked about this with a friend who's on this project with me and he also can't understand the differences. I don't understand why the value is getting lower and lower when I increase OPERATIONS. The operation itself should take the same time (on average, of course, not the exact same time), no matter how many times I execute it. You could tell me that that actually depends on the operation itself, the data being read, and that some data could already be in the cache and so on, but I don't think that's the problem. In my case, myOperation is reading 5000 lines of text from a CSV file, separating the values by ; and inserting those values into a data structure. For each iteration, I'm destroying the data structure and initializing it again.

    Now that I think of it, I also think there's a problem measuring time with clock_gettime(); maybe I'm not using it right. I mean, look at the last example, where OPERATIONS = 10000. The total time it took was 574133641 ns, which would be roughly 0.5 s; that's impossible, it took a couple of minutes, as I couldn't stand looking at the screen waiting and went to eat something.

    Read the article

  • Show javascript execution progress

    - by Midhat
    I have some javascript functions that take about 1 to 3 seconds (some loops or mooML templating code). During this time, the browser is just frozen. I tried showing a "loading" animation (gif image) before starting the operation and hiding it afterwards, but it just doesn't work. The browser freezes before it can render the image and hides it immediately when the function ends. Is there anything I can do to tell the browser to update the screen before going into javascript execution, something like Application.DoEvents or background worker threads? So, any comments/suggestions about how to show javascript execution progress? My primary target browser is IE6, but it should also work on all the latest browsers.

    Read the article

  • Resuming execution of code after exception is thrown and caught

    - by dotnetdev
    Hi, how is it possible to resume code execution after an exception is thrown? For example, take the following code:

        namespace ConsoleApplication1
        {
            class Test
            {
                public void s()
                {
                    throw new NotSupportedException();
                    string @class = "";
                    Console.WriteLine(@class);
                    Console.ReadLine();
                }
            }

            class Program
            {
                static void Main(string[] args)
                {
                    try
                    {
                        new Test().s();
                    }
                    catch (ArgumentException x)
                    {
                    }
                    catch (Exception ex)
                    {
                    }
                }
            }
        }

    After catching the exception when stepping through, the program will stop running. How can I still carry on execution? Thanks
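    As a point of reference, a minimal sketch (not the asker's code) of standard C# behaviour: once a catch block handles the exception, execution resumes at the first statement after that try/catch, so the try/catch needs to wrap only the statement that throws, with the follow-on work placed after it.

        // Sketch: execution continues after the catch block that handles the exception.
        using System;

        class Program
        {
            static void Main()
            {
                try
                {
                    throw new NotSupportedException();
                }
                catch (NotSupportedException)
                {
                    Console.WriteLine("Handled; carrying on.");
                }

                // Control reaches here once the exception has been handled above.
                Console.WriteLine("Still running.");
            }
        }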

    Read the article

  • Very Different Execution Times of SQL Query in C# and SQL Server Management Studio

    - by Paul
    I have a simple SQL query that, when run from C#, takes over 30 seconds and then times out every time, whereas when run in SQL Server Management Studio it completes instantly. In the latter case, a query execution plan reveals nothing troubling, and the execution time is spread nicely over a few simple operations. I've run 'EXEC sp_who2' while the query is running from C#, and it is listed as taking 29,000 milliseconds of CPU time and is not blocked by anything. I have no idea how to begin solving this. Does anyone have some insight? The query is:

        SELECT c.lngId,
               ...
        FROM tblCase c
        INNER JOIN tblCaseStatus s ON s.lngId = c.lngId
        INNER JOIN tblCaseStatusType t ON t.lngId = s.lngId
        INNER JOIN [Another Database]..tblCompany cm ON cm.lngId = cs.lngCompanyId
        WHERE t.lngId = 25
          AND c.IsDeleted = 0
          AND s.lngStatus = 1
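    For context, the question does not show the C# side, so the following is an assumed sketch of how such a query is typically issued from ADO.NET (table names taken from the query above, connection string and simplified join list are assumptions). A parameterized call like this runs under different session SET options than an ad-hoc SSMS window and can be compiled against a different cached plan, which is one reason the two timings are worth comparing side by side.

        // Assumed sketch of the C# side (not the asker's code): parameterized query with an explicit timeout.
        using System;
        using System.Data;
        using System.Data.SqlClient;

        class QueryTiming
        {
            static void Main()
            {
                const string sql = @"SELECT c.lngId
                                     FROM tblCase c
                                     INNER JOIN tblCaseStatus s ON s.lngId = c.lngId
                                     INNER JOIN tblCaseStatusType t ON t.lngId = s.lngId
                                     WHERE t.lngId = @typeId AND c.IsDeleted = 0 AND s.lngStatus = 1";

                using (var conn = new SqlConnection(@"Data Source=.;Initial Catalog=MyDb;Integrated Security=True"))
                using (var cmd = new SqlCommand(sql, conn))
                {
                    cmd.Parameters.Add("@typeId", SqlDbType.Int).Value = 25;
                    cmd.CommandTimeout = 120; // the default of 30 seconds matches the observed timeout

                    conn.Open();
                    var sw = System.Diagnostics.Stopwatch.StartNew();
                    using (var reader = cmd.ExecuteReader())
                    {
                        while (reader.Read()) { /* consume rows */ }
                    }
                    Console.WriteLine("Elapsed: {0} ms", sw.ElapsedMilliseconds);
                }
            }
        }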

    Read the article

  • Oracle: display the execution time for every stored procedure

    - by CC
    Hi all. I'm working on a stored procedure. Inside this one there are many calls to other stored procedures; there are a bunch of them. I was wondering if there is an option to get the execution time of every stored procedure and every function involved (with a start and end time, or something like that). The idea is that I need to optimise it and I should touch every part, and since I'm not sure where the longest execution time is, it's a bit difficult. And after a modification I would like to see whether the whole process is shorter or not. If I call the procedure from Unix, using SQL*Plus, I have no log. If I call it from TOAD, it's blocked until the end. Any idea? I'm not a DBA, so I don't have many rights on the database; I'm just a regular user. Thanks for any advice. C.C.

    Read the article
