Search Results

Search found 27905 results on 1117 pages for 'sql authority'.

Page 648/1117 | < Previous Page | 644 645 646 647 648 649 650 651 652 653 654 655  | Next Page >

  • Procedure Maximum stored procedure, function, trigger, or view nesting level exceeded (limit 32).

    - by Nick
    The stored proc is failing at the location below. Thanks for all your help.

        --Insert MSOrg Information
        DECLARE @PersonnelNumber int, @MSOrg varchar(255)
        DECLARE csr CURSOR FAST_FORWARD FOR SELECT PersonnelNumber FROM Person
        OPEN csr
        FETCH NEXT FROM csr INTO @PersonnelNumber
        WHILE @@FETCH_STATUS = 0
        BEGIN
            EXEC GetMSOrg @PersonnelNumber, @MSOrg out
            INSERT INTO PersonSubject (PersonnelNumber, SubjectID, SubjectValue, Created, Updated)
            SELECT @PersonnelNumber, SubjectID, @MSOrg, getDate(), getDate()
            FROM Subject
            WHERE DisplayName = 'MS Org'
            FETCH NEXT FROM csr INTO @PersonnelNumber
        END
        CLOSE csr
        DEALLOCATE csr

    Below is the definition of the stored procedure GetMSOrg; it fails at the third condition.

        CREATE PROCEDURE [dbo].[GetMSOrg]
        (
            @PersonnelNumber int
            ,@OrgTerm varchar(200) out
        )
        AS
        DECLARE @MDRTermID int
            ,@ReportsToPersonnelNbr int

        --Check to see if we have reached the top of the chart
        SELECT @ReportsToPersonnelNbr = ReportsToPersonnelNbr
        FROM ReportsTo
        WHERE PersonnelNumber = @PersonnelNumber

        IF (@ReportsToPersonnelNbr IS NULL) --Reached the Top of the Org Ladder
        BEGIN
            SET @OrgTerm = 'Non-standard rollup'
        END
        ELSE IF (@PersonnelNumber IN (SELECT PersonnelNumber FROM OrgTermMap))
        BEGIN
            SELECT @OrgTerm = s.Term
            FROM OrgTermMap tm
            JOIN Taxonomy..StaticHierarchy s ON tm.OrgTermID = s.TermID
            WHERE tm.PersonnelNumber = @PersonnelNumber
        END
        ELSE
        BEGIN
            SELECT @MDRTermID = tm.OrgTermID
            FROM ReportsTo r
            JOIN OrgTermMap tm ON r.ReportsToPersonnelNbr = tm.PersonnelNumber
            WHERE r.PersonnelNumber = @PersonnelNumber

            IF (@MDRTermID IS NULL)
            BEGIN
                EXEC GetMSOrg @ReportsToPersonnelNbr, @OrgTerm out
            END
            ELSE
            BEGIN
                SELECT @OrgTerm = Term
                FROM Taxonomy..StaticHierarchy
                WHERE VocabID = 118 AND TermID = @MDRTermID
            END
        END
        GO
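
    The error means the recursive EXEC GetMSOrg inside GetMSOrg has gone more than 32 levels deep, which happens as soon as one reporting chain is longer than the nesting limit. A sketch of one way around it, using only the tables shown above: walk up the ReportsTo chain with a WHILE loop inside a single procedure call instead of recursing. This is an untested rewrite of the logic in the question, not the original author's code.

        CREATE PROCEDURE [dbo].[GetMSOrg_Iterative]
        (
            @PersonnelNumber int
            ,@OrgTerm varchar(200) out
        )
        AS
        BEGIN
            DECLARE @Current int, @ReportsTo int, @MDRTermID int

            SET @Current = @PersonnelNumber

            WHILE (1 = 1)
            BEGIN
                -- reset per iteration: SELECT leaves a variable unchanged when no row matches
                SET @ReportsTo = NULL
                SET @MDRTermID = NULL

                SELECT @ReportsTo = ReportsToPersonnelNbr
                FROM ReportsTo
                WHERE PersonnelNumber = @Current

                IF (@ReportsTo IS NULL) -- reached the top of the org ladder
                BEGIN
                    SET @OrgTerm = 'Non-standard rollup'
                    RETURN
                END

                -- the current person is mapped directly to a term
                IF EXISTS (SELECT 1 FROM OrgTermMap WHERE PersonnelNumber = @Current)
                BEGIN
                    SELECT @OrgTerm = s.Term
                    FROM OrgTermMap tm
                    JOIN Taxonomy..StaticHierarchy s ON tm.OrgTermID = s.TermID
                    WHERE tm.PersonnelNumber = @Current
                    RETURN
                END

                -- the direct manager is mapped to a term
                SELECT @MDRTermID = tm.OrgTermID
                FROM ReportsTo r
                JOIN OrgTermMap tm ON r.ReportsToPersonnelNbr = tm.PersonnelNumber
                WHERE r.PersonnelNumber = @Current

                IF (@MDRTermID IS NOT NULL)
                BEGIN
                    SELECT @OrgTerm = Term
                    FROM Taxonomy..StaticHierarchy
                    WHERE VocabID = 118 AND TermID = @MDRTermID
                    RETURN
                END

                -- otherwise climb one level and try again, with no nested EXEC
                SET @Current = @ReportsTo
            END
        END
        GO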

    Read the article

  • SQLException: incorrect syntax near '2'.

    - by Tobechukwu Ezenachukwu
    Whenever I call ExecuteNonQuery on the following CommandText, I get the SQLException above:

        myCommand.CommandText = "INSERT INTO fixtures (round_id, matchcode, date_utc, time_utc, date_london, time_london, team_A_id, team_A, team_A_country, team_B_id, team_B, team_B_country, status, gameweek, winner, fs_A, fs_B, hts_A, hts_B, ets_A, ets_B, ps_A, ps_B, last_updated) VALUES (" _
            & round_id & "," & match_id & "," & date_utc & ",'" & time_utc & "'," & date_london & ",'" & time_london & "'," & team_A_id & ",'" & team_A_name & "','" & team_A_country & "'," & team_B_id & ",'" & team_B_name & "','" & _
            team_B_country & "','" & status & "'," & gameweek & ",'" & winner & "'," & fs_A & "," & fs_B & "," & hts_A & "," & hts_B & "," & ets_A & "," & ets_B & "," & ps_A & "," & ps_B & "," & last_updated & ")"

    But whenever I remove the last item, last_updated, the error disappears. Please help me resolve this issue. Is there any special treatment needed for datetime fields? Thanks for your help.
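
    The unquoted values are the likely culprit: date_utc, date_london and last_updated are concatenated into the statement without surrounding quotes, so the server sees bare dates (the '2' in the message is presumably the first digit of a year). A minimal sketch of the difference, using a hypothetical demo_fixtures table rather than the real schema; on the VB side, parameterised commands would be the more robust fix because they remove the quoting problem entirely.

        -- Unquoted, the value is not a valid expression and the statement fails, e.g.:
        -- INSERT INTO demo_fixtures (kickoff) VALUES (2013-04-06 15:00:00);
        CREATE TABLE demo_fixtures (kickoff datetime);

        -- Quoted, ideally in an ISO format that regional settings cannot misread, it works:
        INSERT INTO demo_fixtures (kickoff) VALUES ('2013-04-06T15:00:00');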

    Read the article

  • ORACLE: Parameter reference in WHERE doesn't work

    - by Gainder
    Hello, I have created a simple static function in Oracle 10g to get the reference of an object based on its primary key.

        STATIC FUNCTION getRef(nome IN VARCHAR2) RETURN REF folder_typ IS
            fl_r REF folder_typ := null;
        BEGIN
            SELECT REF(fl) INTO fl_r
            FROM folder_tab fl
            WHERE fl.nome = nome;
            RETURN fl_r;
        END getRef;

    This gives me an error because it couldn't fetch a row. If instead of WHERE fl.nome = nome; I write WHERE fl.nome = 'folder1'; it works. I think I'm not using the parameter in the right way. How can I use it?
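
    Inside the query, the unqualified name nome resolves to the column rather than the parameter, so the WHERE clause compares the column with itself instead of filtering on the value passed in. A sketch with the parameter renamed so the collision disappears (qualifying it as getRef.nome is the usual alternative); the types and table are the ones from the question.

        STATIC FUNCTION getRef(p_nome IN VARCHAR2) RETURN REF folder_typ IS
            fl_r REF folder_typ := NULL;
        BEGIN
            SELECT REF(fl) INTO fl_r
            FROM folder_tab fl
            WHERE fl.nome = p_nome;   -- p_nome can no longer be mistaken for the column
            RETURN fl_r;
        END getRef;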

    Read the article

  • How to query range of data in DB2 with highest performance?

    - by Fuangwith S.
    Usually I need to retrieve data from a table in some range; for example, a separate page for each search result. In MySQL I use the LIMIT keyword, but I don't know the DB2 equivalent. Right now I use this query to retrieve a range of data:

        SELECT *
        FROM (
            SELECT SMALLINT(RANK() OVER(ORDER BY NAME DESC)) AS RUNNING_NO,
                   DATA_KEY_VALUE,
                   SHOW_PRIORITY
            FROM EMPLOYEE
            WHERE NAME LIKE 'DEL%'
            ORDER BY NAME DESC
            FETCH FIRST 20 ROWS ONLY
        ) AS TMP
        ORDER BY TMP.RUNNING_NO ASC
        FETCH FIRST 10 ROWS ONLY

    but I know it's bad style. So how should I write the query for the best performance?
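
    A sketch of the pattern usually used for paging in DB2: number the rows once with ROW_NUMBER() and keep only the slice you want. Table and column names are taken from the query above; the 11-20 range would be the second page of ten.

        SELECT RUNNING_NO, DATA_KEY_VALUE, SHOW_PRIORITY
        FROM (
            SELECT ROW_NUMBER() OVER (ORDER BY NAME DESC) AS RUNNING_NO,
                   DATA_KEY_VALUE,
                   SHOW_PRIORITY
            FROM EMPLOYEE
            WHERE NAME LIKE 'DEL%'
        ) AS TMP
        WHERE RUNNING_NO BETWEEN 11 AND 20
        ORDER BY RUNNING_NO

    ROW_NUMBER() also sidesteps RANK() giving ties the same number, and an index on NAME can help DB2 avoid a full sort for the ORDER BY.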

    Read the article

  • How can I kill MySQL queries every 60 seconds in Windows?

    - by Ethan Allen
    I want to check my MySQL server every minute and kill queries that have run longer than 150 seconds. The main reason I want to do this is that I don't want queries from certain people to lock up the DB for everyone else. I know this is not the ultimate solution to the problem, but at least it's a fallback in case something goes wrong with a query. I don't have a slave DB (this is just an at-home project). I'd like to schedule a script that does this for me. I'm unfamiliar with Perl and Ruby, and I need it done on my Windows 2008 Server box. I've looked into creating a simple cmd-line script, but that doesn't seem to be possible. I know I can currently do something like this, but I have to do it manually:

        mysqladmin processlist
        mysqladmin kill

    Anyone have any ideas or examples of how I could do this?
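
    One way to keep the whole thing inside MySQL, as a sketch: a stored procedure that walks INFORMATION_SCHEMA.PROCESSLIST and kills anything that has run over 150 seconds, scheduled with the event scheduler so no external script or Task Scheduler entry is needed. This assumes MySQL 5.1 or later, the event scheduler turned on, and an account with the PROCESS and SUPER privileges; the 'root' exclusion is just an illustrative safety net.

        DELIMITER //

        CREATE PROCEDURE kill_long_queries()
        BEGIN
            DECLARE done INT DEFAULT 0;
            DECLARE qid BIGINT;
            DECLARE cur CURSOR FOR
                SELECT id
                FROM information_schema.PROCESSLIST
                WHERE command = 'Query' AND time > 150 AND user <> 'root';
            DECLARE CONTINUE HANDLER FOR NOT FOUND SET done = 1;

            OPEN cur;
            read_loop: LOOP
                FETCH cur INTO qid;
                IF done THEN LEAVE read_loop; END IF;
                SET @kill_stmt = CONCAT('KILL ', qid);   -- KILL does not accept a variable directly
                PREPARE stmt FROM @kill_stmt;
                EXECUTE stmt;
                DEALLOCATE PREPARE stmt;
            END LOOP;
            CLOSE cur;
        END //

        DELIMITER ;

        SET GLOBAL event_scheduler = ON;

        CREATE EVENT ev_kill_long_queries
            ON SCHEDULE EVERY 1 MINUTE
            DO CALL kill_long_queries();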

    Read the article

  • Copy new records from datatable and identify changes in old records

    - by Betite
    Assume there are two tables, Remote_table and My_table. Remote_table has 6 columns:

        **PROJECT  JOB_TYPE  MONTH  YEAR**  HOURS  IS_DELETED
        134393     70        1      2013    30     0
        134393     70        2      2013    50     0
        134393     70        3      2013    80     0
        134393     70        10     2012    10     0
        134393     70        11     2012    0      0
        134393     70        12     2012    15     0

    My_table is a copy of Remote_table. I tried to copy only the new records from Remote_table with this query:

        SELECT * FROM [remote_DB].[LudanProjectManager].[dbo].Remote_table
        EXCEPT
        SELECT * FROM My_table

    It works OK, but I get a duplicate primary key exception when changes have been made to the HOURS column on Remote_table. Can anyone think of a way to copy only the new records from Remote_table and, if changes have been made to old records, to identify them and update My_table to match?
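
    A sketch using MERGE (SQL Server 2008 and later), which inserts the rows that are new and updates the ones whose HOURS or IS_DELETED changed, in one statement. It assumes PROJECT, JOB_TYPE, MONTH and YEAR make up the primary key, as the marked columns suggest; adjust the ON clause if the real key differs.

        MERGE My_table AS target
        USING [remote_DB].[LudanProjectManager].[dbo].Remote_table AS source
            ON  target.PROJECT  = source.PROJECT
            AND target.JOB_TYPE = source.JOB_TYPE
            AND target.[MONTH]  = source.[MONTH]
            AND target.[YEAR]   = source.[YEAR]
        WHEN MATCHED AND (target.HOURS <> source.HOURS
                          OR target.IS_DELETED <> source.IS_DELETED) THEN
            UPDATE SET target.HOURS      = source.HOURS,
                       target.IS_DELETED = source.IS_DELETED
        WHEN NOT MATCHED BY TARGET THEN
            INSERT (PROJECT, JOB_TYPE, [MONTH], [YEAR], HOURS, IS_DELETED)
            VALUES (source.PROJECT, source.JOB_TYPE, source.[MONTH], source.[YEAR],
                    source.HOURS, source.IS_DELETED);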

    Read the article

  • How would I implement separate databases for reading and writing operations?

    - by Matt
    I am interested in implementing an architecture that has two databases, one for read operations and the other for writes. I have never implemented something like this and have always built single-database, highly normalised systems, so I am not quite sure where to begin. There are a few parts to this question:

    1. What would be a good resource to find out more about this architecture?
    2. Is it just a question of replicating between two identical schemas, or would the schemas differ depending on the operations? Would normalisation vary too?
    3. How do you ensure that data written to one database is immediately available for reading from the second?

    Any further help, tips or resources would be appreciated. Thanks.

    Read the article

  • Voting Script, Possibility of Simplifying Database Queries

    - by Sev
    I have a voting script which stores the post_id and the user_id in a table, to determine whether a particular user has already voted on a post and disallow them in future. To do that, I currently run the following 3 queries:

        SELECT user_id, post_id FROM votes_table WHERE postid=? AND user_id=?

    If that returns no rows, then:

        UPDATE post_table SET votecount = votecount-1 WHERE post_id = ?

    Then, to display the new votecount on the web page:

        SELECT votecount FROM post_table WHERE post_id=?

    Any better way to do this? 3 queries are seriously slowing down the user's voting experience.

    Edit: In the votes table, vote_id is a primary key. In the post table, post_id is a primary key. Any other suggestions to speed things up?
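
    A sketch of one way to trim the round trips, assuming MySQL: put a unique key on (post_id, user_id) so the "has this user already voted?" check becomes implicit in the insert itself, and only run the update and read-back when the driver reports that the insert actually added a row.

        ALTER TABLE votes_table ADD UNIQUE KEY uq_post_user (post_id, user_id);

        -- Affects 0 rows (without raising an error) when the vote already exists.
        INSERT IGNORE INTO votes_table (post_id, user_id) VALUES (?, ?);

        -- Run these two only when the INSERT reported 1 affected row.
        UPDATE post_table SET votecount = votecount - 1 WHERE post_id = ?;
        SELECT votecount FROM post_table WHERE post_id = ?;

    Sending the statements as one batch, or wrapping them in a small stored procedure, also gets them to the server in a single round trip, which is usually what the user perceives as slowness.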

    Read the article

  • Migrating from MSSQL to Firebird: pros and cons

    - by user193655
    I am considering the migration for these reasons:

    1. SQL Server installation is a nightmare, especially for single-user software. The software installs in 10 seconds, SQL Server in 1 hour. Firebird installation is much easier.
    2. SQL Server runs on Windows Server only.
    3. My customers all have the Express edition.
    4. I am not using any advanced features. I am now starting to use FILESTREAM, but the main reason for this is that the Express edition has a 4/10 GB database size limit.

    So these are all pros of moving to Firebird. What are the cons? I could also plan to support both platforms, but I fear this will backfire.

    Read the article

  • DB Interface Design Optimization: Is it better to optimise for fewer requests or for smaller data size?

    - by Overflow
    The prevailing wisdom for web services, and web requests in general, is to design your API so that you use as few requests as possible, with each request therefore returning as much data as is needed. In database design, the accepted wisdom is to design your queries to minimise size over the network, as opposed to minimising the number of queries. They are both remote calls, so what gives?

    Read the article

  • (NOT) NULL for NVARCHAR columns

    - by Anders Abel
    Allowing NULL values on a column is normally done to allow the absence of a value to be represented. When using NVARCHAR there is already a possibility to have an empty string, without setting the column to NULL. In most cases I cannot see a semantic difference between an NVARCHAR with an empty string and a NULL value for such a column. Setting the column to NOT NULL saves me from having to deal with the possibility of NULL values in the code, and it feels better not to have two different representations of "no value" (NULL or an empty string). Will I run into any other problems by setting my NVARCHAR columns to NOT NULL? Performance? Storage size? Anything I've overlooked about the usage of the values in the client code?
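
    A minimal sketch of the pattern being described, with hypothetical table and column names: NOT NULL plus an empty-string default means code that omits the column still works, and every row carries exactly one "no value" representation.

        CREATE TABLE dbo.Customer
        (
            CustomerId int IDENTITY(1,1) PRIMARY KEY,
            Name       nvarchar(100) NOT NULL,
            Notes      nvarchar(400) NOT NULL
                CONSTRAINT DF_Customer_Notes DEFAULT (N'')
        );

        -- Notes is silently filled with N'' instead of NULL
        INSERT INTO dbo.Customer (Name) VALUES (N'Contoso');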

    Read the article

  • How to create a better table structure?

    - by user160820
    For my website I have these tables:

        Category :: id | name
        Product  :: id | name | categoryid

    Now each category may have different sizes; for that I have also created a table:

        Size :: id | name | categoryid | price

    The problem is that each category also has different ingredients that the customer can choose to add to the purchased product, and these ingredients have different prices for different sizes. For that I also have a table like:

        Ingredient :: id | name | sizeid | categoryid | price

    I am not sure whether this structure really is normalized. Can someone please help me optimize it, and tell me which indexes I need for it?
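
    A sketch of one more normalized layout, assuming the price of an ingredient depends on both the ingredient and the size: the ingredient is defined once per category, and its size-dependent prices move into a separate junction table so nothing is repeated. All names here are illustrative.

        CREATE TABLE category (
            id   INT PRIMARY KEY,
            name VARCHAR(100) NOT NULL
        );

        CREATE TABLE product (
            id          INT PRIMARY KEY,
            name        VARCHAR(100) NOT NULL,
            category_id INT NOT NULL REFERENCES category (id)
        );

        CREATE TABLE size (
            id          INT PRIMARY KEY,
            name        VARCHAR(50) NOT NULL,
            category_id INT NOT NULL REFERENCES category (id),
            price       DECIMAL(10,2) NOT NULL
        );

        CREATE TABLE ingredient (
            id          INT PRIMARY KEY,
            name        VARCHAR(100) NOT NULL,
            category_id INT NOT NULL REFERENCES category (id)
        );

        -- one row per ingredient/size combination
        CREATE TABLE ingredient_price (
            ingredient_id INT NOT NULL REFERENCES ingredient (id),
            size_id       INT NOT NULL REFERENCES size (id),
            price         DECIMAL(10,2) NOT NULL,
            PRIMARY KEY (ingredient_id, size_id)
        );

    With this shape, the useful indexes are mainly the ones on the foreign-key columns (product.category_id, size.category_id, ingredient.category_id, plus the composite key of ingredient_price), since those are the columns the joins filter on.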

    Read the article

  • VB.NET: SQLite to MSSQL

    - by user1736785
    I have a VB.NET project that uses a SQLite database, which I access through datasets and table adapters. The client is happy and all works well. However, I have just heard that they plan on providing this product to another customer that wishes to use their MSSQL database, so I am writing this post to mentally prepare before I begin. I am not a database pro and have really enjoyed the simplicity of setting up and managing a SQLite database. Any ideas on the easiest way to support MSSQL as well? I am happy to run the two in parallel. Can I just make a separate service / middleware that syncs the SQLite database to the MSSQL one on a timer and does not care about what the main app is up to? Any pointers are appreciated.

    Read the article

  • How to: Display multiple related classes in an ASP.NET GridView?

    - by kversch
    I would like to display students and their grades in a GridView with LINQ to SQL, like this:

                     assignment1   assignment2
        Student 1    55            89
        Student 2    87            56
        Student 3    92            34

    I found this topic but it doesn't answer my question: http://forums.asp.net/t/1557987.aspx

    I have a many-to-many relationship between students and assignments called "grades"; the grade for the assignment is stored in that table in a "gradeNumber" column. I would also like to specify which assignments should be displayed in the grid. By the way, my LINQ entities are extended to allow me to write studentx.Assignments or assignmentx.Students.

    Read the article

  • How to write the contents of a Rails database to an external file

    - by user1296787
    I'm trying to have Rails write the contents of my database to an external text file, and I want this done every time a new user is created. However, when I try the following in my user.rb model file:

        before_save :write_data

        def write_data()
          File.open("data.txt", "w") do |myfile|
            myfile.write(User.all)
          end
        end

    it doesn't write the actual contents of the database; instead, it displays something like this:

        User:0x109858540

    Can anyone help? Thanks.

    Read the article

  • Insert a time-stamp value into my DB from PHP

    - by Erick
    I'm using Oracle Express, and in my application I would like to insert a time-stamp value into my table:

        $marca = date('y-m-d H:i:s');
        $query = " INSERT INTO SA_VERSIONE ( ID_ACCETTAZIONE, MARCA_TEMPORALE, TESTO, FIRMA, MEDICO)
                   VALUES ('$id', '$marca', '$testo', '$firma', '$medico') ";
        $stid = oci_parse($conn, $query);
        oci_execute($stid);

    but when I execute it, it returns:

        Warning: oci_execute() [function.oci-execute]: ORA-01843: mese non valido in ...

    which says that the month is not valid.
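
    ORA-01843 ("mese non valido" is Italian for "not a valid month") is what Oracle raises when a string is implicitly converted to a date and does not match the session's NLS_DATE_FORMAT. A sketch of the statement with an explicit format mask instead of the implicit conversion, assuming the PHP side is changed to date('Y-m-d H:i:s') so the year has four digits; the :name placeholders are Oracle bind variables (set with oci_bind_by_name), which also remove the quoting concerns.

        INSERT INTO SA_VERSIONE (ID_ACCETTAZIONE, MARCA_TEMPORALE, TESTO, FIRMA, MEDICO)
        VALUES (:id,
                TO_DATE(:marca, 'YYYY-MM-DD HH24:MI:SS'),  -- or TO_TIMESTAMP if the column is a TIMESTAMP
                :testo,
                :firma,
                :medico)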

    Read the article

  • Syntax error in INSERT INTO statement

    - by user454563
    I wrote a program that connects to MS Access. When I fill in the fields and add a new item, the program fails with the exception "Syntax error in INSERT INTO statement". Here is the relevant code.

    AdoHelper.cs:

        using System;
        using System.Collections.Generic;
        using System.Text;
        using System.Data;
        using System.Data.OleDb;

        namespace Yad2
        {
            class AdoHelper
            {
                //get the connection string from the app.config file
                //Provider=Microsoft.ACE.OLEDB.12.0;Data Source=|DataDirectory|\Yad2.accdb
                static string connectionString = Properties.Settings.Default.DBConnection.ToString();

                //declare the db connection
                static OleDbConnection con = new OleDbConnection(connectionString);

                /// <summary>
                /// To Execute queries which returns result set (table / relation)
                /// </summary>
                /// <param name="query">the query string</param>
                /// <returns></returns>
                public static DataTable ExecuteDataTable(string query)
                {
                    try
                    {
                        con.Open();
                        OleDbCommand command = new OleDbCommand(query, con);
                        System.Data.OleDb.OleDbDataAdapter tableAdapter = new System.Data.OleDb.OleDbDataAdapter(command);
                        DataTable dt = new DataTable();
                        tableAdapter.Fill(dt);
                        return dt;
                    }
                    catch (Exception ex)
                    {
                        throw ex;
                    }
                    finally
                    {
                        con.Close();
                    }
                }

                /// <summary>
                /// To Execute update / insert / delete queries
                /// </summary>
                /// <param name="query">the query string</param>
                public static void ExecuteNonQuery(string query)
                {
                    try
                    {
                        con.Open();
                        System.Data.OleDb.OleDbCommand command = new System.Data.OleDb.OleDbCommand(query, con);
                        command.ExecuteNonQuery();
                    }
                    catch (Exception ex)
                    {
                        throw ex;
                    }
                    finally
                    {
                        con.Close();
                    }
                }

                /// <summary>
                /// To Execute queries which return scalar value
                /// </summary>
                /// <param name="query">the query string</param>
                public static object ExecuteScalar(string query)
                {
                    try
                    {
                        con.Open();
                        System.Data.OleDb.OleDbCommand command = new System.Data.OleDb.OleDbCommand(query, con);
                        /// here is the Exception !!!!!!!!!
                        return command.ExecuteScalar();
                    }
                    catch
                    {
                        throw;
                    }
                    finally
                    {
                        con.Close();
                    }
                }
            }
        }

    DataQueries.cs:

        using System;
        using System.Collections.Generic;
        using System.Text;
        using System.Data;

        namespace Yad2
        {
            class DataQueries
            {
                public static DataTable GetAllItems()
                {
                    try
                    {
                        string query = "Select * from Messages";
                        DataTable dt = AdoHelper.ExecuteDataTable(query);
                        return dt;
                    }
                    catch (Exception ex)
                    {
                        throw ex;
                    }
                }

                public static void AddNewItem(string mesNumber, string title, string mesDate, string contactMail, string mesType, string Details)
                {
                    string query = "Insert into Messages values(" + mesNumber + " , '" + title + "' , '" + mesDate + "' , '"
                        + contactMail + "' , , '" + mesType + "' , '" + Details + "')";
                    AdoHelper.ExecuteNonQuery(query);
                }

                public static void DeleteDept(int mesNumber)
                {
                    string query = "Delete from Item where MessageNumber=" + mesNumber;
                    AdoHelper.ExecuteNonQuery(query);
                }
            }
        }

    Please help me: why does the program fail?
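
    The INSERT string built in AddNewItem contains ", ," between contactMail and mesType (one value is missing), and because no column list is given, the value count must match the table exactly, so Access rejects the statement. A sketch of what the statement should look like with an explicit column list; the column names here are guesses inferred from the parameter names, so substitute the real ones.

        INSERT INTO Messages (MessageNumber, Title, MessageDate, ContactMail, MessageType, Details)
        VALUES (42, 'Bike for sale', '2012-11-03', 'someone@example.com', 'For sale', 'Barely used');

    If MessageDate is an Access Date/Time column, the literal needs #...# delimiters rather than quotes; better still, build the command with OleDbCommand parameters (? placeholders) so quoting and type conversion stop being your problem.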

    Read the article

  • If a table has two xml columns, will inserting records be a lot slower?

    - by Lieven Cardoen
    Is it a bad thing to have two xml columns in one table? + How much slower are these xml columns in terms of updating/inserting/reading data? In profiler this kind of insert normally takes 0 ms, but sometimes it goes up to 160ms:

        declare @p8 xml
        set @p8=convert(xml,N'<interactions><interaction correct="false" score="0" id="0" gapid="0" x="61" y="225"><feedback/><element id="0" position="0" elementtype="1"><asset/></element></interaction><interaction correct="false" score="0" id="1" gapid="1" x="64" y="250"><feedback/><element id="0" position="0" elementtype="1"><asset/></element></interaction><interaction correct="false" score="0" id="2" gapid="2" x="131" y="250"><feedback/><element id="0" position="0" elementtype="1"><asset/></element></interaction></interactions>')
        declare @p14 xml
        set @p14=convert(xml,N'<contentinteractions/>')
        exec sp_executesql N'INSERT INTO [dbo].[PackageSessionNodes]([dbo].[PackageSessionNodes].[PackageSessionId], [dbo].[PackageSessionNodes].[TreeNodeId],
            [dbo].[PackageSessionNodes].[Duration], [dbo].[PackageSessionNodes].[Score], [dbo].[PackageSessionNodes].[ScoreMax],
            [dbo].[PackageSessionNodes].[Interactions], [dbo].[PackageSessionNodes].[BrainTeaser], [dbo].[PackageSessionNodes].[DateCreated],
            [dbo].[PackageSessionNodes].[CompletionStatus], [dbo].[PackageSessionNodes].[ReducedScore], [dbo].[PackageSessionNodes].[ReducedScoreMax],
            [dbo].[PackageSessionNodes].[ContentInteractions])
        VALUES (@ins_dboPackageSessionNodesPackageSessionId, @ins_dboPackageSessionNodesTreeNodeId, @ins_dboPackageSessionNodesDuration,
            @ins_dboPackageSessionNodesScore, @ins_dboPackageSessionNodesScoreMax, @ins_dboPackageSessionNodesInteractions,
            @ins_dboPackageSessionNodesBrainTeaser, @ins_dboPackageSessionNodesDateCreated, @ins_dboPackageSessionNodesCompletionStatus,
            @ins_dboPackageSessionNodesReducedScore, @ins_dboPackageSessionNodesReducedScoreMax, @ins_dboPackageSessionNodesContentInteractions) ;
        SELECT SCOPE_IDENTITY() as new_id

    This is the table:

        CREATE TABLE [dbo].[PackageSessionNodes](
            [PackageSessionNodeId] [int] IDENTITY(1,1) NOT NULL,
            [PackageSessionId] [int] NOT NULL,
            [TreeNodeId] [int] NOT NULL,
            [Duration] [int] NULL,
            [Score] [float] NOT NULL,
            [ScoreMax] [float] NOT NULL,
            [Interactions] [xml] NOT NULL,
            [BrainTeaser] [bit] NOT NULL,
            [DateCreated] [datetime] NULL,
            [CompletionStatus] [int] NOT NULL,
            [ReducedScore] [float] NOT NULL,
            [ReducedScoreMax] [float] NOT NULL,
            [ContentInteractions] [xml] NOT NULL,
            CONSTRAINT [PK_PackageSessionNodes] PRIMARY KEY CLUSTERED
            (
                [PackageSessionNodeId] ASC
            ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
        ) ON [PRIMARY]
        GO
        ALTER TABLE [dbo].[PackageSessionNodes] WITH CHECK ADD CONSTRAINT [FK_PackageSessionNodes_PackageSessions]
            FOREIGN KEY([PackageSessionId]) REFERENCES [dbo].[PackageSessions] ([PackageSessionId]) ON UPDATE CASCADE ON DELETE CASCADE
        GO
        ALTER TABLE [dbo].[PackageSessionNodes] CHECK CONSTRAINT [FK_PackageSessionNodes_PackageSessions]
        GO
        ALTER TABLE [dbo].[PackageSessionNodes] WITH CHECK ADD CONSTRAINT [FK_PackageSessionNodes_TreeNodes]
            FOREIGN KEY([TreeNodeId]) REFERENCES [dbo].[TreeNodes] ([TreeNodeId])
        GO
        ALTER TABLE [dbo].[PackageSessionNodes] CHECK CONSTRAINT [FK_PackageSessionNodes_TreeNodes]
        GO
        ALTER TABLE [dbo].[PackageSessionNodes] ADD CONSTRAINT [DF_PackageSessionNodes_Score] DEFAULT ((-1)) FOR [Score]
        GO
        ALTER TABLE [dbo].[PackageSessionNodes] ADD CONSTRAINT [DF_PackageSessionNodes_ScoreMax] DEFAULT ((-1)) FOR [ScoreMax]
        GO
        ALTER TABLE [dbo].[PackageSessionNodes] ADD CONSTRAINT [DF_PackageSessionNodes_DateCreated] DEFAULT (getdate()) FOR [DateCreated]
        GO
        ALTER TABLE [dbo].[PackageSessionNodes] ADD CONSTRAINT [DF_PackageSessionNodes_ReducedScore] DEFAULT ((-1)) FOR [ReducedScore]
        GO
        ALTER TABLE [dbo].[PackageSessionNodes] ADD CONSTRAINT [DF_PackageSessionNodes_ReducedScoreMax] DEFAULT ((-1)) FOR [ReducedScoreMax]
        GO

    Read the article

  • Exporting many tables on Oracle

    - by Adomas
    Hi, I would like to know how to export many tables from an Oracle DB. I use exp.exe, which creates the file expdat.dmp, and so on. I choose to export only tables, and there I must write which ones. Is there any chance of getting all of them? Thanks.

    Read the article

  • Update with inner join?

    - by phenevo
    I have two databases, DB1 and DB2. How do I do something like:

        update myServer.DB1.dbo.hotels.Name = myServer.DB2.dbo.hotels.Name
        join myServer.DB2.dbo.hotels on myServer.DB2.dbo.hotels.Code = myServer.DB1.dbo.hotels.Code
        where myServer.DB2.dbo.hotels.CountryCoe != myServer.DB1.dbo.hotels.CountryCode
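
    A sketch of the T-SQL form, which uses UPDATE ... FROM with a JOIN and table aliases. It assumes both databases sit on the same server (so three-part names are enough) and that the mismatched column is CountryCode on both sides; add the linked-server prefix back if they really live on different servers.

        UPDATE d1
           SET d1.Name = d2.Name
          FROM DB1.dbo.hotels AS d1
          JOIN DB2.dbo.hotels AS d2
            ON d2.Code = d1.Code
         WHERE d2.CountryCode <> d1.CountryCode;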

    Read the article

  • Strange behavior with large Object Types

    - by Peter Lang
    I noticed that calling a method on an Oracle object type takes longer when the instance gets bigger. The code below just adds rows to a collection stored in the object type and calls the empty dummy procedure in the loop. Calls take longer when more rows are in the collection. When I just remove the call to dummy, performance is much better (the collection still contains the same number of records):

        Calling dummy:   Not calling dummy:
        11               0
        81               0
        158              0

    Code to reproduce:

        Create Type t_tab Is Table Of VARCHAR2(10000);

        Create Type test_type As Object(
            tab t_tab,
            Member Procedure dummy
        );

        Create Type Body test_type As
            Member Procedure dummy As
            Begin
                Null; --# Do nothing
            End dummy;
        End;

        Declare
            v_test_type test_type := New test_type( New t_tab() );

            Procedure run_test As
                start_time NUMBER := dbms_utility.get_time;
            Begin
                For i In 1 .. 200 Loop
                    v_test_Type.tab.Extend;
                    v_test_Type.tab(v_test_Type.tab.Last) := Lpad(' ', 10000);
                    v_test_Type.dummy(); --# Removed this line in second test
                End Loop;
                dbms_output.put_line( dbms_utility.get_time - start_time );
            End run_test;
        Begin
            run_test;
            run_test;
            run_test;
        End;

    I tried with both 10g and 11g. Can anyone explain/reproduce this behavior?

    Read the article

  • Determine caller within stored proc or trigger

    - by Mike Clark
    I am working with an insert trigger in a Sybase database. I know I can check @@nestlevel to determine whether I am being called directly or as a result of another trigger or procedure. Is there any way to determine, when the nesting level is deeper than 1, who performed the action that caused the trigger to fire? For example, was the table inserted into directly, or was it inserted into by another trigger, and if so, which one?

    Read the article
