Search Results

Search found 27581 results on 1104 pages for 'execute command'.


  • Maven assembly - Error reading assemblies

    - by Laurent
    Dear all, I have defined a personalized jar-with-dependencies assembly descriptor. However, when I execute it with mvn assembly:assembly, I get : ... [INFO] META-INF/ already added, skipping [INFO] META-INF/MANIFEST.MF already added, skipping [INFO] javax/ already added, skipping [INFO] META-INF/ already added, skipping [INFO] META-INF/MANIFEST.MF already added, skipping [INFO] META-INF/maven/ already added, skipping [INFO] [assembly:assembly {execution: default-cli}] [INFO] ------------------------------------------------------------------------ [ERROR] BUILD ERROR [INFO] ------------------------------------------------------------------------ [INFO] Error reading assemblies: No assembly descriptors found. My jar-with-dependencies.xml is in src/main/resources/assemblies/. My assembly descriptor is the following : <?xml version='1.0' encoding='UTF-8'?> <assembly> <id>jar-with-dependencies</id> <formats> <format>jar</format> </formats> <dependencySets> <dependencySet> <scope>runtime</scope> <unpack>true</unpack> <unpackOptions> <excludes> <exclude>**/LICENSE*</exclude> <exclude>**/README*</exclude> </excludes> </unpackOptions> </dependencySet> </dependencySets> <fileSets> <fileSet> <directory>${project.build.outputDirectory}</directory> <outputDirectory>/</outputDirectory> </fileSet> <fileSet> <directory>src/main/resources/META-INF/services</directory> <outputDirectory>META-INF/services</outputDirectory> </fileSet> </fileSets> </assembly> And my project pom.xml is : <plugin> <groupId>org.apache.maven.plugins</groupId> <artifactId>maven-assembly-plugin</artifactId> <version>2.2-beta-5</version> <executions> <execution> <id>jar-with-dependencies</id> <phase>package</phase> <goals> <goal>single</goal> </goals> <configuration> <descriptors> <descriptor>jar-with-dependencies.xml</descriptor> </descriptors> <archive> <manifest> <mainClass>org.my.app.HowTo</mainClass> </manifest> </archive> </configuration> </execution> </executions> </plugin> When mvn assembly:assembly is performed, dependencies are unpacked and I get the previous error when unpack has finished. Moreover, if I execute mvn -e assembly:assembly it is say that no descriptors has been found, however it try to unpack dependencies and a JAR with dependencies is created but it doesn't contain META-INF/services/* as specified in descriptor : [ERROR] BUILD ERROR [INFO] ------------------------------------------------------------------------ [INFO] Error reading assemblies: No assembly descriptors found. [INFO] ------------------------------------------------------------------------ [INFO] Trace org.apache.maven.lifecycle.LifecycleExecutionException: Error reading assemblies: No assembly descriptors found. 
at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoals(DefaultLifecycleExecutor.java:719) at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeStandaloneGoal(DefaultLifecycleExecutor.java:569) at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoal(DefaultLifecycleExecutor.java:539) at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoalAndHandleFailures(DefaultLifecycleExecutor.java:387) at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeTaskSegments(DefaultLifecycleExecutor.java:284) at org.apache.maven.lifecycle.DefaultLifecycleExecutor.execute(DefaultLifecycleExecutor.java:180) at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:328) at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:138) at org.apache.maven.cli.MavenCli.main(MavenCli.java:362) at org.apache.maven.cli.compat.CompatibleMain.main(CompatibleMain.java:60) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) at java.lang.reflect.Method.invoke(Method.java:597) at org.codehaus.classworlds.Launcher.launchEnhanced(Launcher.java:315) at org.codehaus.classworlds.Launcher.launch(Launcher.java:255) at org.codehaus.classworlds.Launcher.mainWithExitCode(Launcher.java:430) at org.codehaus.classworlds.Launcher.main(Launcher.java:375) Caused by: org.apache.maven.plugin.MojoExecutionException: Error reading assemblies: No assembly descriptors found. at org.apache.maven.plugin.assembly.mojos.AbstractAssemblyMojo.execute(AbstractAssemblyMojo.java:356) at org.apache.maven.plugin.DefaultPluginManager.executeMojo(DefaultPluginManager.java:490) at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoals(DefaultLifecycleExecutor.java:694) ... 17 more Caused by: org.apache.maven.plugin.assembly.io.AssemblyReadException: No assembly descriptors found. at org.apache.maven.plugin.assembly.io.DefaultAssemblyReader.readAssemblies(DefaultAssemblyReader.java:206) at org.apache.maven.plugin.assembly.mojos.AbstractAssemblyMojo.execute(AbstractAssemblyMojo.java:352) ... 19 more I don't see my error. Does someone has a solution ? Kind Regards Laurent

    Read the article

  • Linked servers SQLNCLI problem. "No transaction is active"

    - by Felipe Fiali
    I'm trying to execute a stored procedure and simply insert its results into a temporary table, and I'm getting the following message: The operation could not be performed because OLE DB provider "SQLNCLI" for linked server "MyServerName" was unable to begin a distributed transaction. OLE DB provider "SQLNCLI" for linked server "MyServerName" returned message "No transaction is active.". My query looks like this: INSERT INTO #TABLE EXEC MyServerName.MyDatabase.dbo.MyStoredProcedure Param1, Param2, Param3 The column count and names match exactly, so the result set is not the problem. MSDTC is enabled and started on both machines, and so is remote procedure calling. The machines are not in the same domain, but I can execute remote queries from my machine and get results. I can even execute the stored procedure and see its results; I just can't insert them into another table. Help, please? :)

    Read the article

  • xpath query in a servlet gives exception

    - by user1401071
    I have a Document object initialized in the servlet's init() method and use it in the doPost() method to service requests. The selectNodeList() XPath query throws an exception when the servlet services many requests at the same time (see the per-request sketch below). The exception is shown here: Caused by: javax.xml.transform.TransformerException: -1 at org.apache.xpath.XPath.execute(XPath.java:331) at org.apache.xpath.CachedXPathAPI.eval(CachedXPathAPI.java:328) at org.apache.xpath.CachedXPathAPI.selectNodeList(CachedXPathAPI.java:255) at org.apache.xpath.CachedXPathAPI.selectNodeList(CachedXPathAPI.java:235) at com.pro.bb.servlets.Controller.getDataOrPeriodForReport(Controller.java:511) ... 23 more Caused by: java.lang.ArrayIndexOutOfBoundsException: -1 at org.apache.xpath.XPathContext.pushCurrentNode(XPathContext.java:808) at org.apache.xpath.axes.PredicatedNodeTest.acceptNode(PredicatedNodeTest.java:447) at org.apache.xpath.axes.AxesWalker.nextNode(AxesWalker.java:409) at org.apache.xpath.axes.WalkingIterator.nextNode(WalkingIterator.java:176) at org.apache.xpath.axes.NodeSequence.nextNode(NodeSequence.java:320) at org.apache.xpath.axes.NodeSequence.runTo(NodeSequence.java:474) at org.apache.xpath.axes.NodeSequence.setRoot(NodeSequence.java:257) at org.apache.xpath.axes.LocPathIterator.execute(LocPathIterator.java:257) at org.apache.xpath.XPath.execute(XPath.java:308) Please help me sort out this issue.
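    The trace suggests several request threads are driving the same Xalan XPath machinery at once, and neither CachedXPathAPI nor its internal iterators are documented as thread-safe. One common workaround is to create the XPath evaluator per request rather than sharing one cached instance; the sketch below uses the standard javax.xml.xpath API, and the class and expression names are illustrative, not taken from the post:

        import javax.xml.xpath.*;
        import org.w3c.dom.Document;
        import org.w3c.dom.NodeList;

        // Hypothetical helper: one fresh evaluator per call, so no XPath state is shared between threads.
        public final class XPathPerRequest {
            public static NodeList select(Document doc, String expression) throws XPathExpressionException {
                XPath xpath = XPathFactory.newInstance().newXPath();
                // If the shared Document was built with a deferred DOM, even concurrent reads can be
                // unsafe; parsing it eagerly or synchronizing on the document are the usual options.
                return (NodeList) xpath.evaluate(expression, doc, XPathConstants.NODESET);
            }
        }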

    Read the article

  • How can I resolve the error "Could not load type 'Microsoft.SharePoint.WebControls.SPGridView'" in SharePoint?

    - by ricky roy
    Following error comes when creating a WebPart In sharePoint 2010 Server. Web Part Error: Unhandled exception was thrown by the user code wrapper's Execute method in the partial trust app domain: System.Web.HttpUnhandledException: Exception of type 'System.Web.HttpUnhandledException' was thrown. --- System.TypeLoadException: Could not load type 'Microsoft.SharePoint.WebControls.SPGridView' from assembly 'Microsoft.SharePoint, Version=14.900.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c'. at ListMenuSample.ListMenuSample.ListMenuSample.CreateChildControls() at System.Web.UI.Control.EnsureChildControls() at System.Web.UI.Control.PreRenderRecursiveInternal() at System.Web.UI.Control.PreRenderRecursiveInternal() at System.Web.UI.Control.PreRenderRecursiveInternal() at System.Web.UI.Control.PreRenderRecursiveInternal() at System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) --- End of inner exception stack trace --- at System.Web.UI.Page.HandleError(Exception e) at System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) at System.Web.UI.Page.ProcessRequest(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) at System.Web.UI.Page.ProcessRequest() at System.Web.UI.Page.ProcessRequest(HttpContext context) at Microsoft.SharePoint.UserCode.SPUserCodeWebPartWrapper.ExecuteHttpRequest(SPUserCodeWebPartHttpRequestContext webPartExecutionContext, SPUserCodeWebPartHttpResponse httpRequestResponse) at Microsoft.SharePoint.UserCode.SPUserCodeWebPartWrapper.Execute(SPUserCodeExecutionContext executionContext) at Microsoft.SharePoint.UserCode.SPUserCodeApplicationHostAppDomainRef.Execute(Type userCodeWrapperType, SPUserCodeCachedAssemblyGroup userAssemblyGroup, Guid siteCollectionId, Byte[] binaryUserCodeToken, Byte[] proxyOperationToken, SPUserCodeExecutionContext executionContext) Thanks & Regards, Basant

    Read the article

  • Generating new sources via Maven plugin after compile phase

    - by japher
    I have a Maven project in which I need to execute two code generation steps. One generates some Java types, and the second depends on those Java types to generate some more code. Is there a way to have both of these steps happen during my build? At the moment my steps are: execute the first code generation plugin (during generate-sources), add the directory of generated types to the build path, and execute the second code generation plugin (during compile). However, my problem is that anything generated by the second code generation plugin will not be compiled (because the compile phase has finished). If I attach the second code generation plugin to an earlier phase, it fails because it needs the classes from the first code generation plugin to be present on the classpath. I know I could split this into two modules with one dependent on the other, but I was wondering if this could be achieved in one POM. It seems like I need a way to invoke compile again after the normal compile phase is complete. Any ideas?

    Read the article

  • Java - Difference between SwingWorker and SwingUtilities.invokeLater()

    - by Yatendra Goel
    SwingWorker is used for the following purposes: running long-running tasks on a separate thread so the GUI stays responsive; updating the GUI with the results produced by the long-running task, at the end of the task, through the done() method; and updating the GUI from time to time with the intermediate results produced and published by the task, via the publish() and process() methods. SwingUtilities.invokeLater() can perform the same tasks as follows: instead of calling SwingWorker.execute() from the EDT, we can call ExecutorService.submit(new MyRunnable()), since it will also run the long-running task on another thread; to update the GUI at the end of the task, we can wrap the code (written in the done() method of case 1) in SwingUtilities.invokeLater(new RunnableToExecuteDoneMethodCode()) at the end of the task; and to update the GUI in the middle of the task, we can wrap the code (written in the process() method of case 1) in SwingUtilities.invokeLater(new RunnableToExecuteProcessMethodCode()) at the place where we called publish() in case 1. (Both approaches are sketched below.) I am asking this question because the problem specified in http://stackoverflow.com/questions/2797483/java-swingworker-can-we-call-one-swingworker-from-other-swingworker-instead-o/2824306#2824306 can be solved by SwingUtilities.invokeLater() but can't be solved with SwingWorker.
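    A side-by-side sketch of the two approaches described above (the task body and GUI updates are placeholders, not code from the linked question):

        import java.util.List;
        import java.util.concurrent.ExecutorService;
        import javax.swing.SwingUtilities;
        import javax.swing.SwingWorker;

        public class TwoWays {
            // 1) SwingWorker: background work in doInBackground(), EDT updates in process()/done().
            static void withSwingWorker() {
                new SwingWorker<String, Integer>() {
                    @Override protected String doInBackground() throws Exception {
                        for (int i = 0; i <= 100; i += 10) {
                            Thread.sleep(100);   // stand-in for the long-running task
                            publish(i);          // intermediate results
                        }
                        return "finished";
                    }
                    @Override protected void process(List<Integer> chunks) { /* runs on the EDT */ }
                    @Override protected void done() { /* runs on the EDT */ }
                }.execute();
            }

            // 2) ExecutorService + invokeLater: the same shape, wired by hand.
            static void withInvokeLater(ExecutorService pool) {
                pool.submit(() -> {
                    for (int i = 0; i <= 100; i += 10) {
                        try { Thread.sleep(100); } catch (InterruptedException e) { return; }
                        SwingUtilities.invokeLater(() -> { /* update progress on the EDT */ });
                    }
                    SwingUtilities.invokeLater(() -> { /* show the final result on the EDT */ });
                });
            }
        }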

    Read the article

  • oracle global temporary tables

    - by mrp
    I created a global temporary table. When I execute the code as individual statements it works fine, but when I execute it as a single script in TOAD, no records are created; there is just an empty global temporary table. For example: CREATE GLOBAL TEMPORARY TABLE TEMP_TRAN ( COL1 NUMBER(9), COL2 VARCHAR2(30), COL3 DATE ) ON COMMIT PRESERVE ROWS / INSERT INTO TEMP_TRAN VALUES(1,'D',sysdate); / INSERT INTO TEMP_TRAN VALUES(2,'I',sysdate); / INSERT INTO TEMP_TRAN VALUES(3,'s',sysdate); / COMMIT; When I run the above code one statement at a time it works fine, but when I execute it as a script it runs without errors and yet there are no records in the temporary table. Can anyone help me with this, please?

    Read the article

  • ASM programming, how to use loop?

    - by chris
    Hello. Im first time here.I am a college student. I've created a simple program by using assembly language. And im wondering if i can use loop method to run it almost samething as what it does below the program i posted. and im also eager to find someome who i can talk through MSN messanger so i can ask you questions right away.(if possible) ok thank you .MODEL small .STACK 400h .data prompt db 10,13,'Please enter a 3 digit number, example 100:',10,13,'$' ;10,13 cause to go to next line first_digit db 0d second_digit db 0d third_digit db 0d Not_prime db 10,13,'This number is not prime!',10,13,'$' prime db 10,13,'This number is prime!',10,13,'$' question db 10,13,'Do you want to contine Y/N $' counter dw 0d number dw 0d half dw ? .code Start: mov ax, @data ;establish access to the data segment mov ds, ax mov number, 0d LetsRoll: mov dx, offset prompt ; print the string (please enter a 3 digit...) mov ah, 9h int 21h ;execute ;read FIRST DIGIT mov ah, 1d ;bios code for read a keystroke int 21h ;call bios, it is understood that the ascii code will be returned in al mov first_digit, al ;may as well save a copy sub al, 30h ;Convert code to an actual integer cbw ;CONVERT BYTE TO WORD. This takes whatever number is in al and ;extends it to ax, doubling its size from 8 bits to 16 bits ;The first digit now occupies all of ax as an integer mov cx, 100d ;This is so we can calculate 100*1st digit +10*2nd digit + 3rd digit mul cx ;start to accumulate the 3 digit number in the variable imul cx ;it is understood that the other operand is ax ;AND that the result will use both dx::ax ;but we understand that dx will contain only leading zeros add number, ax ;save ;variable <number> now contains 1st digit * 10 ;---------------------------------------------------------------------- ;read SECOND DIGIT, multiply by 10 and add in mov ah, 1d ;bios code for read a keystroke int 21h ;call bios, it is understood that the ascii code will be returned in al mov second_digit, al ;may as well save a copy sub al, 30h ;Convert code to an actual integer cbw ;CONVERT BYTE TO WORD. This takes whatever number is in al and ;extends it to ax, boubling its size from 8 bits to 16 bits ;The first digit now occupies all of ax as an integer mov cx, 10d ;continue to accumulate the 3 digit number in the variable mul cx ;it is understood that the other operand is ax, containing first digit ;AND that the result will use both dx::ax ;but we understand that dx will contain only leading zeros. Ignore them add number, ax ;save -- nearly finished ;variable <number> now contains 1st digit * 100 + second digit * 10 ;---------------------------------------------------------------------- ;read THIRD DIGIT, add it in (no multiplication this time) mov ah, 1d ;bios code for read a keystroke int 21h ;call bios, it is understood that the ascii code will be returned in al mov third_digit, al ;may as well save a copy sub al, 30h ;Convert code to an actual integer cbw ;CONVERT BYTE TO WORD. This takes whatever number is in al and ;extends it to ax, boubling its size from 8 bits to 16 bits ;The first digit now occupies all of ax as an integer add number, ax ;Both my variable number and ax are 16 bits, so equal size mov ax, number ;copy contents of number to ax mov cx, 2h div cx ;Divide by cx mov half, ax ;copy the contents of ax to half mov cx, 2h; mov ax, number; ;copy numbers to ax xor dx, dx ;flush dx jmp prime_check ;jump to prime check print_question: mov dx, offset question ;print string (do you want to continue Y/N?) 
mov ah, 9h int 21h ;execute mov ah, 1h int 21h ;execute cmp al, 4eh ;compare je Exit ;jump to exit cmp al, 6eh ;compare je Exit ;jump to exit cmp al, 59h ;compare je Start ;jump to start cmp al, 79h ;compare je Start ;jump to start prime_check: div cx; ;Divide by cx cmp dx, 0h ;reset the value of dx je print_not_prime ;jump to not prime xor dx, dx; ;flush dx mov ax, number ;copy the contents of number to ax cmp cx, half ;compare half with cx je print_prime ;jump to print prime section inc cx; ;increment cx by one jmp prime_check ;repeat the prime check print_prime: mov dx, offset prime ;print string (this number is prime!) mov ah, 9h int 21h ;execute jmp print_question ;jumps to question (do you want to continue Y/N?) this is for repeat print_not_prime: mov dx, offset Not_prime ;print string (this number is not prime!) mov ah, 9h int 21h ;execute jmp print_question ;jumps to question (do you want to continue Y/N?) this is for repeat Exit: mov ah, 4ch int 21h ;execute exit END Start

    Read the article

  • Invoke a COM addin option from VBA

    - by rip
    Can I invoke an option on a COM add-in from a VBA macro in Word or Excel 2007? The COM add-in was written using VSTO – it adds a custom ribbon tab with a number of options that I want to execute from a VBA macro. I can reference the add-in using Application.COMAddIns("MyAddinName"), but I can't find a way to invoke one of its options. I've also played around with the Application.CommandBars collection, and can see that you can execute a control using CommandBarControl.Execute, but I can't find my command bar in the Application.CommandBars collection. Does anyone know if this is possible?

    Read the article

  • Python SQLite: database is locked

    - by user322683
    I'm trying this code: import sqlite connection = sqlite.connect('cache.db') cur = connection.cursor() cur.execute('''create table item (id integer primary key, itemno text unique, scancode text, descr text, price real)''') connection.commit() cur.close() I'm catching this exception: Traceback (most recent call last): File "cache_storage.py", line 7, in <module> scancode text, descr text, price real)''') File "/usr/lib/python2.6/dist-packages/sqlite/main.py", line 237, in execute self.con._begin() File "/usr/lib/python2.6/dist-packages/sqlite/main.py", line 503, in _begin self.db.execute("BEGIN") _sqlite.OperationalError: database is locked Permissions for cache.db are ok. Any ideas?

    Read the article

  • How to convert from K&R C to ANSI C?

    - by Vadakkumpadath
    I am trying to execute following code which is the 1988 entry of Obfuscated C Code Contest. #define _ -F<00||--F-OO--; int F=00,OO=00;main(){F_OO();printf("%1.3f\n",4.*-F/OO/OO);}F_OO() { _-_-_-_ _-_-_-_-_-_-_-_-_ _-_-_-_-_-_-_-_-_-_-_-_ _-_-_-_-_-_-_-_-_-_-_-_-_-_ _-_-_-_-_-_-_-_-_-_-_-_-_-_-_ _-_-_-_-_-_-_-_-_-_-_-_-_-_-_ _-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_ _-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_ _-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_ _-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_ _-_-_-_-_-_-_-_-_-_-_-_-_-_-_ _-_-_-_-_-_-_-_-_-_-_-_-_-_-_ _-_-_-_-_-_-_-_-_-_-_-_-_-_ _-_-_-_-_-_-_-_-_-_-_-_ _-_-_-_-_-_-_-_ _-_-_-_ } From entry description, this code is calculating pi by looking at its own area. I successfully compiled it without changing the code. But when I executed, it is giving me a value 0.25, what I am expecting is 3.14. Code description says it is in K&R C and it doesn't work correctly in ANSI C without some change. I think I have to do those modification to execute it properly. I don't have any previous experience with K&R C. So can someone help me to change above code to ANSI C or point to the problems if any. I am using Microsoft Visual Studio 2008 to execute this.

    Read the article

  • pyODBC and Unicode Problem

    - by Aviv Giladi
    Hey guys, I'm working with pyODBC to communicate with an MS SQL 2005 Express server. The table to which I'm trying to save the data consists of nvarchar columns. query = u"INSERT INTO tblPersons (name, birthday, gender) VALUES('" query = query + name + u"', '" query = query + birthday + u"', '" query = query + gender + u"')" cur.execute(query) The variables name, birthday and gender are read from an Excel file, and they are Unicode strings. When I execute the query and either look at the table with SQL Server Management Studio or run a query that fetches the data that was just inserted, all the data that was written in a non-English language turns into question marks. The data that was written in English is preserved and appears in the table correctly. I tried adding CHARSET=UTF16 to my connection string, but had no luck with that. I can use UTF-8, which works fine, but as a working convention I need all the data saved in my DB to be UTF-16. Thanks!

    Read the article

  • DOS Batch script loop

    - by Tom J Nowell
    I need to execute a command 100-200 times. So far my research indicates that I would either have to copy-paste 100 copies of this command, or use a FOR loop, but the FOR loop expects a list of items, so I would need 200 files to operate on, or a list of 200 items, defeating the point. I would really rather not write a C program and go to the length of documenting why I had to write another program to execute my program for test purposes. Modifying my program itself is also not an option. So, given a command, how would I execute it 100-200 times via a DOS batch script?

    Read the article

  • n++ vs n=n+1: which one is faster?

    - by piemesons
    Somebody asked me: is n++ faster than n=n+1? My answer: ++ is a unary operator in C, and n++ takes only one machine instruction to execute, while n=n+1 takes more than one machine instruction. Correct me if I am wrong, but in assembler it looks something like this: n++ becomes inc n, while n = n + 1 becomes mov ax, n / add ax, 1 / mov n, ax. It's not exactly this, but it's close. However, in most cases a good compiler will change n = n + 1 to ++n, so a good compiler will generate the same code for both and hence the same execution time (see the JVM sketch below).
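    The same point is easy to check concretely on the JVM: compile both forms and compare the output of javap -c. A small sketch (the bytecode mentioned in the comments is what an unoptimizing javac typically emits; the JIT ends up treating both forms identically):

        public class IncDemo {
            static int postIncrement(int n) {
                n++;        // typically compiles to a single iinc instruction
                return n;
            }

            static int plusOne(int n) {
                n = n + 1;  // often iload / iconst_1 / iadd / istore, folded to the same thing at runtime
                return n;
            }
        }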

    Read the article

  • nhibernate fatal error

    - by Afif Lamloumi
    i have an error ( System.InvalidCastException: Unable to cast object of type 'AccountProxy' to type 'System.String'.) when i did this code i mapped the tables( Account,AccountString,EventData,...) of the the database opengts ( open source) i have this error when i called a function from EventData.cs IQuery query = session.CreateQuery("FROM Eventdata"); IList pets = query.List(); return pets; the Stack Trace: [InvalidCastException: Impossible d'effectuer un cast d'un objet de type 'AccountProxy' en type 'System.String'.] (Object , Object[] , SetterCallback ) +431 NHibernate.Bytecode.Lightweight.AccessOptimizer.SetPropertyValues(Object target, Object[] values) +20 NHibernate.Tuple.Component.PocoComponentTuplizer.SetPropertyValues(Object component, Object[] values) +49 NHibernate.Type.ComponentType.SetPropertyValues(Object component, Object[] values, EntityMode entityMode) +34 NHibernate.Type.ComponentType.ResolveIdentifier(Object value, ISessionImplementor session, Object owner) +150 NHibernate.Type.ComponentType.NullSafeGet(IDataReader rs, String[] names, ISessionImplementor session, Object owner) +42 NHibernate.Loader.Loader.GetKeyFromResultSet(Int32 i, IEntityPersister persister, Object id, IDataReader rs, ISessionImplementor session) +93 NHibernate.Loader.Loader.GetRowFromResultSet(IDataReader resultSet, ISessionImplementor session, QueryParameters queryParameters, LockMode[] lockModeArray, EntityKey optionalObjectKey, IList hydratedObjects, EntityKey[] keys, Boolean returnProxies) +92 NHibernate.Loader.Loader.DoQuery(ISessionImplementor session, QueryParameters queryParameters, Boolean returnProxies) +675 NHibernate.Loader.Loader.DoQueryAndInitializeNonLazyCollections(ISessionImplementor session, QueryParameters queryParameters, Boolean returnProxies) +129 NHibernate.Loader.Loader.DoList(ISessionImplementor session, QueryParameters queryParameters) +116 [GenericADOException: could not execute query [ select eventdata0_.deviceID as deviceID5_, eventdata0_.timestamp as timestamp5_, eventdata0_.statusCode as statusCode5_, eventdata0_.accountID as accountID5_, eventdata0_.latitude as latitude5_, eventdata0_.longitude as longitude5_, eventdata0_.gpsAge as gpsAge5_, eventdata0_.speedKPH as speedKPH5_, eventdata0_.heading as heading5_, eventdata0_.altitude as altitude5_, eventdata0_.transportID as transpo11_5_, eventdata0_.inputMask as inputMask5_, eventdata0_.outputMask as outputMask5_, eventdata0_.address as address5_, eventdata0_.DataSource as DataSource5_, eventdata0_.rawdata as rawdata5_, eventdata0_.distanceKM as distanceKM5_, eventdata0_.odometerKM as odometerKM5_, eventdata0_.geozoneIndex as geozone19_5_, eventdata0_.geozoneID as geozoneID5_, eventdata0_.creationTime as creatio21_5_ from eventdata eventdata0_ ] [SQL: select eventdata0_.deviceID as deviceID5_, eventdata0_.timestamp as timestamp5_, eventdata0_.statusCode as statusCode5_, eventdata0_.accountID as accountID5_, eventdata0_.latitude as latitude5_, eventdata0_.longitude as longitude5_, eventdata0_.gpsAge as gpsAge5_, eventdata0_.speedKPH as speedKPH5_, eventdata0_.heading as heading5_, eventdata0_.altitude as altitude5_, eventdata0_.transportID as transpo11_5_, eventdata0_.inputMask as inputMask5_, eventdata0_.outputMask as outputMask5_, eventdata0_.address as address5_, eventdata0_.DataSource as DataSource5_, eventdata0_.rawdata as rawdata5_, eventdata0_.distanceKM as distanceKM5_, eventdata0_.odometerKM as odometerKM5_, eventdata0_.geozoneIndex as geozone19_5_, eventdata0_.geozoneID as geozoneID5_, 
eventdata0_.creationTime as creatio21_5_ from eventdata eventdata0_]] NHibernate.Loader.Loader.DoList(ISessionImplementor session, QueryParameters queryParameters) +213 NHibernate.Loader.Loader.ListIgnoreQueryCache(ISessionImplementor session, QueryParameters queryParameters) +18 NHibernate.Loader.Loader.List(ISessionImplementor session, QueryParameters queryParameters, ISet`1 querySpaces, IType[] resultTypes) +79 NHibernate.Hql.Ast.ANTLR.Loader.QueryLoader.List(ISessionImplementor session, QueryParameters queryParameters) +51 NHibernate.Hql.Ast.ANTLR.QueryTranslatorImpl.List(ISessionImplementor session, QueryParameters queryParameters) +231 NHibernate.Engine.Query.HQLQueryPlan.PerformList(QueryParameters queryParameters, ISessionImplementor session, IList results) +369 NHibernate.Impl.SessionImpl.List(String query, QueryParameters queryParameters, IList results) +317 NHibernate.Impl.SessionImpl.List(String query, QueryParameters parameters) +282 NHibernate.Impl.QueryImpl.List() +163 DATA1.EventdataExtensions.GetEventdata() in C:\Users\HP\Desktop\our_project\DATA1\Queries\Eventdata.cs:33 MvcApplication7.Controllers.HistoriqueController.Index() in C:\Users\HP\Desktop\our_project\MvcApplication7\Controllers\HistoriqueController.cs:17 lambda_method(Closure , ControllerBase , Object[] ) +62 System.Web.Mvc.ActionMethodDispatcher.Execute(ControllerBase controller, Object[] parameters) +17 System.Web.Mvc.ReflectedActionDescriptor.Execute(ControllerContext controllerContext, IDictionary`2 parameters) +208 System.Web.Mvc.ControllerActionInvoker.InvokeActionMethod(ControllerContext controllerContext, ActionDescriptor actionDescriptor, IDictionary`2 parameters) +27 System.Web.Mvc.<>c__DisplayClass15.<InvokeActionMethodWithFilters>b__12() +55 System.Web.Mvc.ControllerActionInvoker.InvokeActionMethodFilter(IActionFilter filter, ActionExecutingContext preContext, Func`1 continuation) +263 System.Web.Mvc.<>c__DisplayClass17.<InvokeActionMethodWithFilters>b__14() +19 System.Web.Mvc.ControllerActionInvoker.InvokeActionMethodWithFilters(ControllerContext controllerContext, IList`1 filters, ActionDescriptor actionDescriptor, IDictionary`2 parameters) +191 System.Web.Mvc.ControllerActionInvoker.InvokeAction(ControllerContext controllerContext, String actionName) +343 System.Web.Mvc.Controller.ExecuteCore() +116 System.Web.Mvc.ControllerBase.Execute(RequestContext requestContext) +97 System.Web.Mvc.ControllerBase.System.Web.Mvc.IController.Execute(RequestContext requestContext) +10 System.Web.Mvc.<>c__DisplayClassb.<BeginProcessRequest>b__5() +37 System.Web.Mvc.Async.<>c__DisplayClass1.<MakeVoidDelegate>b__0() +21 System.Web.Mvc.Async.<>c__DisplayClass8`1.<BeginSynchronous>b__7(IAsyncResult _) +12 System.Web.Mvc.Async.WrappedAsyncResult`1.End() +62 System.Web.Mvc.<>c__DisplayClasse.<EndProcessRequest>b__d() +50 System.Web.Mvc.SecurityUtil.<GetCallInAppTrustThunk>b__0(Action f) +7 System.Web.Mvc.SecurityUtil.ProcessInApplicationTrust(Action action) +22 System.Web.Mvc.MvcHandler.EndProcessRequest(IAsyncResult asyncResult) +60 System.Web.Mvc.MvcHandler.System.Web.IHttpAsyncHandler.EndProcessRequest(IAsyncResult result) +9 System.Web.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute() +8841105 System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously) +184 Any suggestions? 
How can I correct this error? Data entity class (outtake from comment): public class MyClass { public virtual string DeviceID { get; set; } public virtual int Timestamp { get; set; } public virtual string Account { get; set; } public virtual int StatusCode { get; set; } public virtual double Latitude { get; set; } public virtual double Longitude { get; set; } public virtual int GpsAge { get; set; } public virtual double SpeedKPH { get; set; } public virtual double Heading { get; set; } public override bool Equals(object obj) { return true; } public override int GetHashCode() { return 0; } }

    Read the article

  • Bulk inserting: best way to go about it? + Help me fully understand what I found so far

    - by chobo2
    Hi So I saw this post here and read it and it seems like bulk copy might be the way to go. http://stackoverflow.com/questions/682015/whats-the-best-way-to-bulk-database-inserts-from-c I still have some questions and want to know how things actually work. So I found 2 tutorials. http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx First way uses 2 ado.net 2.0 features. BulkInsert and BulkCopy. the second one uses linq to sql and OpenXML. This sort of appeals to me as I am using linq to sql already and prefer it over ado.net. However as one person pointed out in the posts what he just going around the issue at the cost of performance( nothing wrong with that in my opinion) First I will talk about the 2 ways in the first tutorial I am using VS2010 Express, .net 4.0, MVC 2.0, SQl Server 2005 Is ado.net 2.0 the most current version? Based on the technology I am using, is there some updates to what I am going to show that would improve it somehow? Is there any thing that these tutorial left out that I should know about? BulkInsert I am using this table for all the examples. CREATE TABLE [dbo].[TBL_TEST_TEST] ( ID INT IDENTITY(1,1) PRIMARY KEY, [NAME] [varchar](50) ) SP Code USE [Test] GO /****** Object: StoredProcedure [dbo].[sp_BatchInsert] Script Date: 05/19/2010 15:12:47 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO ALTER PROCEDURE [dbo].[sp_BatchInsert] (@Name VARCHAR(50) ) AS BEGIN INSERT INTO TBL_TEST_TEST VALUES (@Name); END C# Code /// <summary> /// Another ado.net 2.0 way that uses a stored procedure to do a bulk insert. /// Seems slower then "BatchBulkCopy" way and it crashes when you try to insert 500,000 records in one go. /// http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 /// </summary> private static void BatchInsert() { // Get the DataTable with Rows State as RowState.Added DataTable dtInsertRows = GetDataTable(); SqlConnection connection = new SqlConnection(connectionString); SqlCommand command = new SqlCommand("sp_BatchInsert", connection); command.CommandType = CommandType.StoredProcedure; command.UpdatedRowSource = UpdateRowSource.None; // Set the Parameter with appropriate Source Column Name command.Parameters.Add("@Name", SqlDbType.VarChar, 50, dtInsertRows.Columns[0].ColumnName); SqlDataAdapter adpt = new SqlDataAdapter(); adpt.InsertCommand = command; // Specify the number of records to be Inserted/Updated in one go. Default is 1. adpt.UpdateBatchSize = 1000; connection.Open(); int recordsInserted = adpt.Update(dtInsertRows); connection.Close(); } So first thing is the batch size. Why would you set a batch size to anything but the number of records you are sending? Like I am sending 500,000 records so I did a Batch size of 500,000. Next why does it crash when I do this? If I set it to 1000 for batch size it works just fine. System.Data.SqlClient.SqlException was unhandled Message="A transport-level error has occurred when sending the request to the server. 
(provider: Shared Memory Provider, error: 0 - No process is on the other end of the pipe.)" Source=".Net SqlClient Data Provider" ErrorCode=-2146232060 Class=20 LineNumber=0 Number=233 Server="" State=0 StackTrace: at System.Data.Common.DbDataAdapter.UpdatedRowStatusErrors(RowUpdatedEventArgs rowUpdatedEvent, BatchCommandInfo[] batchCommands, Int32 commandCount) at System.Data.Common.DbDataAdapter.UpdatedRowStatus(RowUpdatedEventArgs rowUpdatedEvent, BatchCommandInfo[] batchCommands, Int32 commandCount) at System.Data.Common.DbDataAdapter.Update(DataRow[] dataRows, DataTableMapping tableMapping) at System.Data.Common.DbDataAdapter.UpdateFromDataTable(DataTable dataTable, DataTableMapping tableMapping) at System.Data.Common.DbDataAdapter.Update(DataTable dataTable) at TestIQueryable.Program.BatchInsert() in C:\Users\a\Downloads\TestIQueryable\TestIQueryable\TestIQueryable\Program.cs:line 124 at TestIQueryable.Program.Main(String[] args) in C:\Users\a\Downloads\TestIQueryable\TestIQueryable\TestIQueryable\Program.cs:line 16 InnerException: Time it took to insert 500,000 records with insert batch size of 1000 took "2 mins and 54 seconds" Of course this is no official time I sat there with a stop watch( I am sure there are better ways but was too lazy to look what they where) So I find that kinda slow compared to all my other ones(expect the linq to sql insert one) and I am not really sure why. Next I looked at bulkcopy /// <summary> /// An ado.net 2.0 way to mass insert records. This seems to be the fastest. /// http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 /// </summary> private static void BatchBulkCopy() { // Get the DataTable DataTable dtInsertRows = GetDataTable(); using (SqlBulkCopy sbc = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.KeepIdentity)) { sbc.DestinationTableName = "TBL_TEST_TEST"; // Number of records to be processed in one go sbc.BatchSize = 500000; // Map the Source Column from DataTabel to the Destination Columns in SQL Server 2005 Person Table // sbc.ColumnMappings.Add("ID", "ID"); sbc.ColumnMappings.Add("NAME", "NAME"); // Number of records after which client has to be notified about its status sbc.NotifyAfter = dtInsertRows.Rows.Count; // Event that gets fired when NotifyAfter number of records are processed. sbc.SqlRowsCopied += new SqlRowsCopiedEventHandler(sbc_SqlRowsCopied); // Finally write to server sbc.WriteToServer(dtInsertRows); sbc.Close(); } } This one seemed to go really fast and did not even need a SP( can you use SP with bulk copy? If you can would it be better?) BatchCopy had no problem with a 500,000 batch size.So again why make it smaller then the number of records you want to send? I found that with BatchCopy and 500,000 batch size it took only 5 seconds to complete. I then tried with a batch size of 1,000 and it only took 8 seconds. So much faster then the bulkinsert one above. Now I tried the other tutorial. USE [Test] GO /****** Object: StoredProcedure [dbo].[spTEST_InsertXMLTEST_TEST] Script Date: 05/19/2010 15:39:03 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO ALTER PROCEDURE [dbo].[spTEST_InsertXMLTEST_TEST](@UpdatedProdData nText) AS DECLARE @hDoc int exec sp_xml_preparedocument @hDoc OUTPUT,@UpdatedProdData INSERT INTO TBL_TEST_TEST(NAME) SELECT XMLProdTable.NAME FROM OPENXML(@hDoc, 'ArrayOfTBL_TEST_TEST/TBL_TEST_TEST', 2) WITH ( ID Int, NAME varchar(100) ) XMLProdTable EXEC sp_xml_removedocument @hDoc C# code. /// <summary> /// This is using linq to sql to make the table objects. 
/// It is then serailzed to to an xml document and sent to a stored proedure /// that then does a bulk insert(I think with OpenXML) /// http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx /// </summary> private static void LinqInsertXMLBatch() { using (TestDataContext db = new TestDataContext()) { TBL_TEST_TEST[] testRecords = new TBL_TEST_TEST[500000]; for (int count = 0; count < 500000; count++) { TBL_TEST_TEST testRecord = new TBL_TEST_TEST(); testRecord.NAME = "Name : " + count; testRecords[count] = testRecord; } StringBuilder sBuilder = new StringBuilder(); System.IO.StringWriter sWriter = new System.IO.StringWriter(sBuilder); XmlSerializer serializer = new XmlSerializer(typeof(TBL_TEST_TEST[])); serializer.Serialize(sWriter, testRecords); db.insertTestData(sBuilder.ToString()); } } So I like this because I get to use objects even though it is kinda redundant. I don't get how the SP works. Like I don't get the whole thing. I don't know if OPENXML has some batch insert under the hood but I do not even know how to take this example SP and change it to fit my tables since like I said I don't know what is going on. I also don't know what would happen if the object you have more tables in it. Like say I have a ProductName table what has a relationship to a Product table or something like that. In linq to sql you could get the product name object and make changes to the Product table in that same object. So I am not sure how to take that into account. I am not sure if I would have to do separate inserts or what. The time was pretty good for 500,000 records it took 52 seconds The last way of course was just using linq to do it all and it was pretty bad. /// <summary> /// This is using linq to sql to to insert lots of records. /// This way is slow as it uses no mass insert. /// Only tried to insert 50,000 records as I did not want to sit around till it did 500,000 records. /// http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx /// </summary> private static void LinqInsertAll() { using (TestDataContext db = new TestDataContext()) { db.CommandTimeout = 600; for (int count = 0; count < 50000; count++) { TBL_TEST_TEST testRecord = new TBL_TEST_TEST(); testRecord.NAME = "Name : " + count; db.TBL_TEST_TESTs.InsertOnSubmit(testRecord); } db.SubmitChanges(); } } I did only 50,000 records and that took over a minute to do. So I really narrowed it done to the linq to sql bulk insert way or bulk copy. I am just not sure how to do it when you have relationship for either way. I am not sure how they both stand up when doing updates instead of inserts as I have not gotten around to try it yet. I don't think I will ever need to insert/update more than 50,000 records at one type but at the same time I know I will have to do validation on records before inserting so that will slow it down and that sort of makes linq to sql nicer as your got objects especially if your first parsing data from a xml file before you insert into the database. Full C# code using System; using System.Collections.Generic; using System.Linq; using System.Text; using System.Xml.Serialization; using System.Data; using System.Data.SqlClient; namespace TestIQueryable { class Program { private static string connectionString = ""; static void Main(string[] args) { BatchInsert(); Console.WriteLine("done"); } /// <summary> /// This is using linq to sql to to insert lots of records. /// This way is slow as it uses no mass insert. 
/// Only tried to insert 50,000 records as I did not want to sit around till it did 500,000 records. /// http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx /// </summary> private static void LinqInsertAll() { using (TestDataContext db = new TestDataContext()) { db.CommandTimeout = 600; for (int count = 0; count < 50000; count++) { TBL_TEST_TEST testRecord = new TBL_TEST_TEST(); testRecord.NAME = "Name : " + count; db.TBL_TEST_TESTs.InsertOnSubmit(testRecord); } db.SubmitChanges(); } } /// <summary> /// This is using linq to sql to make the table objects. /// It is then serailzed to to an xml document and sent to a stored proedure /// that then does a bulk insert(I think with OpenXML) /// http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx /// </summary> private static void LinqInsertXMLBatch() { using (TestDataContext db = new TestDataContext()) { TBL_TEST_TEST[] testRecords = new TBL_TEST_TEST[500000]; for (int count = 0; count < 500000; count++) { TBL_TEST_TEST testRecord = new TBL_TEST_TEST(); testRecord.NAME = "Name : " + count; testRecords[count] = testRecord; } StringBuilder sBuilder = new StringBuilder(); System.IO.StringWriter sWriter = new System.IO.StringWriter(sBuilder); XmlSerializer serializer = new XmlSerializer(typeof(TBL_TEST_TEST[])); serializer.Serialize(sWriter, testRecords); db.insertTestData(sBuilder.ToString()); } } /// <summary> /// An ado.net 2.0 way to mass insert records. This seems to be the fastest. /// http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 /// </summary> private static void BatchBulkCopy() { // Get the DataTable DataTable dtInsertRows = GetDataTable(); using (SqlBulkCopy sbc = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.KeepIdentity)) { sbc.DestinationTableName = "TBL_TEST_TEST"; // Number of records to be processed in one go sbc.BatchSize = 500000; // Map the Source Column from DataTabel to the Destination Columns in SQL Server 2005 Person Table // sbc.ColumnMappings.Add("ID", "ID"); sbc.ColumnMappings.Add("NAME", "NAME"); // Number of records after which client has to be notified about its status sbc.NotifyAfter = dtInsertRows.Rows.Count; // Event that gets fired when NotifyAfter number of records are processed. sbc.SqlRowsCopied += new SqlRowsCopiedEventHandler(sbc_SqlRowsCopied); // Finally write to server sbc.WriteToServer(dtInsertRows); sbc.Close(); } } /// <summary> /// Another ado.net 2.0 way that uses a stored procedure to do a bulk insert. /// Seems slower then "BatchBulkCopy" way and it crashes when you try to insert 500,000 records in one go. /// http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 /// </summary> private static void BatchInsert() { // Get the DataTable with Rows State as RowState.Added DataTable dtInsertRows = GetDataTable(); SqlConnection connection = new SqlConnection(connectionString); SqlCommand command = new SqlCommand("sp_BatchInsert", connection); command.CommandType = CommandType.StoredProcedure; command.UpdatedRowSource = UpdateRowSource.None; // Set the Parameter with appropriate Source Column Name command.Parameters.Add("@Name", SqlDbType.VarChar, 50, dtInsertRows.Columns[0].ColumnName); SqlDataAdapter adpt = new SqlDataAdapter(); adpt.InsertCommand = command; // Specify the number of records to be Inserted/Updated in one go. Default is 1. 
adpt.UpdateBatchSize = 500000; connection.Open(); int recordsInserted = adpt.Update(dtInsertRows); connection.Close(); } private static DataTable GetDataTable() { // You First need a DataTable and have all the insert values in it DataTable dtInsertRows = new DataTable(); dtInsertRows.Columns.Add("NAME"); for (int i = 0; i < 500000; i++) { DataRow drInsertRow = dtInsertRows.NewRow(); string name = "Name : " + i; drInsertRow["NAME"] = name; dtInsertRows.Rows.Add(drInsertRow); } return dtInsertRows; } static void sbc_SqlRowsCopied(object sender, SqlRowsCopiedEventArgs e) { Console.WriteLine("Number of records affected : " + e.RowsCopied.ToString()); } } }

    Read the article

  • How to use the Query Window correctly in SQL Server 2008

    - by Richard77
    Hello, what should I do to avoid having commands executed again each time I hit the 'Execute!' icon? I mean this: USE master; GO CREATE DATABASE Sales GO USE Sales; GO CREATE TABLE Customers( CustomerID int NOT NULL, LName varchar (50) NOT NULL, FName varchar (50) NULL, Status varchar (10), ModifiedBy varchar (30) NULL ) GO When I click Execute!, SQL Server tries to redo the same thing. What I do for now is clear the Query Window completely and then write what I need before clicking the Execute icon, but I doubt that I should have to do that. What can I do to keep writing commands without having to clear the Query Window each time? Thanks for helping.

    Read the article

  • Scala and HttpClient: How do I resolve this error?

    - by Benjamin Metz
    I'm using scala with Apache HttpClient, and working through examples. I'm getting the following error: /Users/benjaminmetz/IdeaProjects/JakartaCapOne/src/JakExamp.scala Error:Error:line (16)error: overloaded method value execute with alternatives (org.apache.http.HttpHost,org.apache.http.HttpRequest)org.apache.http.HttpResponse <and> (org.apache.http.client.methods.HttpUriRequest,org.apache.http.protocol.HttpContext)org.apache.http.HttpResponse cannot be applied to (org.apache.http.client.methods.HttpGet,org.apache.http.client.ResponseHandler[String]) val responseBody = httpclient.execute(httpget, responseHandler) Here is the code with the error and line in question highlighted: import org.apache.http.client.ResponseHandler import org.apache.http.client.HttpClient import org.apache.http.client.methods.HttpGet import org.apache.http.impl.client.BasicResponseHandler import org.apache.http.impl.client.DefaultHttpClient object JakExamp { def main(args : Array[String]) : Unit = { val httpclient: HttpClient = new DefaultHttpClient val httpget: HttpGet = new HttpGet("www.google.com") println("executing request..." + httpget.getURI) val responseHandler: ResponseHandler[String] = new BasicResponseHandler val responseBody = httpclient.execute(httpget, responseHandler) // ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ println(responseBody) client.getConnectionManager.shutdown } } I can successfully run the example in java...
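    For reference, the equivalent call in plain Java against HttpClient 4.x, where execute(HttpUriRequest, ResponseHandler) is one of the available overloads, looks like the sketch below; note the explicit http:// scheme on the URI. This is only a comparison point under those assumptions, not a confirmed fix for the Scala overload-resolution error:

        import org.apache.http.client.HttpClient;
        import org.apache.http.client.ResponseHandler;
        import org.apache.http.client.methods.HttpGet;
        import org.apache.http.impl.client.BasicResponseHandler;
        import org.apache.http.impl.client.DefaultHttpClient;

        public class JakExampJava {
            public static void main(String[] args) throws Exception {
                HttpClient httpclient = new DefaultHttpClient();
                HttpGet httpget = new HttpGet("http://www.google.com");   // scheme included
                System.out.println("executing request... " + httpget.getURI());
                ResponseHandler<String> responseHandler = new BasicResponseHandler();
                String responseBody = httpclient.execute(httpget, responseHandler);
                System.out.println(responseBody);
                httpclient.getConnectionManager().shutdown();
            }
        }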

    Read the article

  • SSIS - Parallel Execution of Tasks - How efficient is it?

    - by Randy Minder
    I am building an SSIS package that will contain dozens of Sequence tasks. Each Sequence task will contain three tasks. One to truncate a destination table and remove indexes on the table, another to import data from a source table, and a third to add back indexes to the destination table. My question is this. I currently have nine of these Sequences tasks built, and none are dependent on any of the others. When I execute the package, SSIS seems to do a pretty good job of determining which tasks in which Sequence to execute, which, by the way, appears to be quite random. As I continue adding more Sequences, should I attempt to be smarter about how SSIS should execute these Sequences, or is SSIS smart enough to do it itself? Thanks.

    Read the article

  • Do MOSS 2007 workflows support calling external methods?

    - by Mina Samy
    Hi all, I have a custom SharePoint workflow from which I need to call an external method defined in a local service. It always throws an exception: System.InvalidOperationException: Could not find service of type 'ListItemCheckService.IListItemCheck' through the currently configured services. Consider adding the service to ExternalDataExchangeService. at System.Workflow.Activities.CallExternalMethodActivity.Execute(ActivityExecutionContext executionContext) at System.Workflow.ComponentModel.ActivityExecutor`1.Execute(T activity, ActivityExecutionContext executionContext) at System.Workflow.ComponentModel.ActivityExecutor`1.Execute(Activity activity, ActivityExecutionContext executionContext) at System.Workflow.ComponentModel.ActivityExecutorOperation.Run(IWorkflowCoreRuntime workflowCoreRuntime) at System.Workflow.Runtime.Scheduler.Run() The question is: does the SharePoint workflow system support calling external methods from a local service? Thanks.

    Read the article

  • SYSDATE - 1 error on pl/sql function

    - by ayo
    Hi curtisk/all, I have an issue: when I run the statements below, it gives me the following error: select 'EXECUTE DBMS_LOGMNR.ADD_LOGFILE(LOGFILENAME =>'''||name||'''||,OPTIONS=>DBMS_LOGMNR.NEW);' from v\$archived_log where name is not null; select 'EXECUTE DBMS_LOGMNR.ADD_LOGFILE(LOGFILENAME =>'''||name||'''||,OPTIONS=>DBMS_LOGMNR.ADDFILE);' from v\$archived_log where name is not null; EXECUTE DBMS_LOGMNR.START_LOGMNR( STARTTIME => SYSDATE - 1, ENDTIME => SYSDATE, OPTIONS => DBMS_LOGMNR.DICT_FROM_ONLINE_CATALOG + DBMS_LOGMNR.CONTINUOUS_MINE + DBMS_LOGMNR.COMMITTED_DATA_ONLY + DBMS_LOGMNR.PRINT_PRETTY_SQL); Error: * ERROR at line 1: ORA-01291: missing logfile ORA-06512: at "SYS.DBMS_LOGMNR", line 58 ORA-06512: at line 1 But I have added all the archived logs from several days back, and my SYSDATE is today. Kindly help out with this issue. Thanks. Regards, Ayo

    Read the article

  • Query Returning value as 0

    - by NIMISH DESHPANDE
    I am trying to execute the following PL/SQL script in SQL Developer. The loop should return the count of nulls in each column, but somehow it returns 0 every time. set serveroutput on DECLARE --v_count number; v_count_null number; BEGIN execute immediate 'select count(*) from SP_MOSAIX' into v_count; FOR i in (select column_name from all_tab_COLUMNS where table_name = 'SP_MOSAIX') LOOP select count(*) into v_count_null from SP_MOSAIX where i.column_name IS NULL ; dbms_output.put_line(v_count_null); END LOOP; END; When I run this, the following output is what I get: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 But if I manually execute the query, substituting the actual column name, I get the expected result. select count(*) into v_count_null from SP_MOSAIX where i.column_name IS NULL; Can anybody help with this?

    Read the article

  • Why does the connection in Python's DB-API not have a "begin" operation?

    - by newtover
    Working with cursors in mysql-python, I used to call "BEGIN;", "COMMIT;", and "ROLLBACK;" explicitly, as follows: try: cursor.execute("BEGIN;") # some statements cursor.execute("COMMIT;") except: cursor.execute("ROLLBACK;") Then I found out that the underlying connection object has the corresponding methods: try: cursor.connection.begin() # some statements cursor.connection.commit() except: cursor.connection.rollback() Inspecting the DB-API PEP, I found that it does not mention a begin() method for the connection object, even among the extensions. Mysql-python, by the way, throws a DeprecationWarning when you use the method, and sqlite3's connection, for example, does not have the method at all. So the question is: why is there no such method in the PEP? Is the statement somehow optional; is it enough to invoke commit() instead?
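    For what it's worth, JDBC follows the same design: there is no begin() on java.sql.Connection either; a transaction starts implicitly once auto-commit is switched off, and commit()/rollback() mark its end. A minimal sketch (the in-memory H2 URL and table are placeholders, assuming the H2 driver is on the classpath):

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.SQLException;
        import java.sql.Statement;

        public class TxDemo {
            public static void main(String[] args) throws SQLException {
                try (Connection con = DriverManager.getConnection("jdbc:h2:mem:demo")) {
                    con.setAutoCommit(false);                 // no explicit BEGIN anywhere
                    try (Statement st = con.createStatement()) {
                        st.executeUpdate("CREATE TABLE item(id INT PRIMARY KEY)");
                        st.executeUpdate("INSERT INTO item VALUES (1)");
                        con.commit();                         // ends the implicit transaction...
                    } catch (SQLException e) {
                        con.rollback();                       // ...or undoes it; the next statement starts a new one
                        throw e;
                    }
                }
            }
        }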

    Read the article

  • PHP PDO close()?

    - by PHPLOVER
    Can someone tell me: when you, for example, update, insert, or delete, should you then close the statement with $stmt->close();? I checked the PHP manual and don't understand what close() actually does. EXAMPLE: $stmt = $dbh->prepare("SELECT `user_email` FROM `users` WHERE `user_email` = ? LIMIT 1"); $stmt->execute(array($email)); $stmt->close(); The next part of my question is: if, as an example, I had multiple update queries in a transaction, should I close each statement individually after every execute()? Because it's a transaction, I'm not sure whether I need to use $stmt->close(); after each execute(); or just one $stmt->close(); after all of them. Thanks once again, phplover

    Read the article

  • Rails: problem with find_by_sql

    - by Totty
    I have this query and I have an error: images = Image.find_by_sql('PREPARE stmt FROM \' SELECT * FROM images AS i WHERE i.on_id = 1 AND i.on_type = "profile" ORDER BY i.updated_at LIMIT ?, 6\ '; SET @lower_limit := ((5 DIV 6) * 6); EXECUTE stmt USING @lower_limit;') Mysql::Error: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'SET @lower_limit := ((5 DIV 6) * 6); EXECUTE stmt USING @lower_limit' at line 1: PREPARE stmt FROM ' SELECT * FROM images AS i WHERE i.on_id = 1 AND i.on_type = "profile" ORDER BY i.updated_at LIMIT ?, 6'; SET @lower_limit := ((5 DIV 6) * 6); EXECUTE stmt USING @lower_limit;

    Read the article
