Search Results

Search found 34476 results on 1380 pages for 'sql blog'.


  • Use JavaScript RegEx to extract column names from SQLite CREATE TABLE SQL

    - by NimbusSoftware
    I'm trying to extract column names from a SQLite result set by reading the sql column of sqlite_master. I get tripped up by the regular expressions in the match() and split() calls:

        t1.executeSql('SELECT name, sql FROM sqlite_master WHERE type="table" and name!="__WebKitDatabaseInfoTable__";', [],
            function(t1, result) {
                for (i = 0; i < result.rows.length; i++) {
                    var tbl = result.rows.item(i).name;
                    var dbSchema = result.rows.item(i).sql;
                    // errors out on next line
                    var columns = dbSchema.match(/.*CREATE\s+TABLE\s+(\S+)\s+\((.*)\).*/)[2].split(/\s+[^,]+,?\s*/);
                }
            },
            function() { console.log('err1'); }
        );

    I want to parse SQL statements like these...

        CREATE TABLE sqlite_sequence(name,seq);
        CREATE TABLE tblConfig (Key TEXT NOT NULL,Value TEXT NOT NULL);
        CREATE TABLE tblIcon (IconID INTEGER NOT NULL PRIMARY KEY,png TEXT NOT NULL,img32 TEXT NOT NULL,img64 TEXT NOT NULL,Version TEXT NOT NULL)

    ...into strings like these:

        name,seq
        Key,Value
        IconID,png,img32,img64,Version

    Any help with a RegEx would be greatly appreciated.
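
    For what it's worth, the match-then-split idea above can be sketched like this - shown in C# purely for illustration, and assuming column definitions never contain embedded commas (no DECIMAL(10,2), no table-level constraints):

        using System;
        using System.Linq;
        using System.Text.RegularExpressions;

        class ColumnNameDemo
        {
            // Hypothetical helper: pulls column names out of a simple CREATE TABLE statement.
            static string ExtractColumns(string createSql)
            {
                Match m = Regex.Match(createSql,
                    @"CREATE\s+TABLE\s+\S+\s*\((.*)\)",
                    RegexOptions.IgnoreCase | RegexOptions.Singleline);
                if (!m.Success) return "";

                // The first token of each comma-separated definition is the column name.
                var names = m.Groups[1].Value
                    .Split(',')
                    .Select(def => def.Trim().Split(' ')[0]);
                return string.Join(",", names);
            }

            static void Main()
            {
                Console.WriteLine(ExtractColumns("CREATE TABLE sqlite_sequence(name,seq);"));                         // name,seq
                Console.WriteLine(ExtractColumns("CREATE TABLE tblConfig (Key TEXT NOT NULL,Value TEXT NOT NULL);")); // Key,Value
            }
        }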

    Read the article

  • IIS 7.5 fails to open the database after the virtual machine hosting the database server restarts

    - by Jenea
    Hi. I decided to post this question here as well, in case the issue we are having is related to SQL Server. There is a problem that has bothered me for some time. I have an ASP.NET MVC application that uses NHibernate for modeling the database. The infrastructure is the following: Windows 2008 R2 on all virtual machines; IIS 7.5 runs on one virtual machine; SQL Server 2008 runs on another virtual machine. We have a couple of databases: two that store application data and one that registers all unhandled exceptions. Sometimes the virtual machine that hosts the database server restarts (in the middle of the night, not quite sure about the reason); after that, the connection to the databases that store application data stops working, and as a result thousands of unhandled exceptions get registered in the third database. It is important to mention that the databases remain accessible from Management Studio. The problem is solved by resetting IIS. Connections are handled via an NHibernateUtil class which opens and closes a session on each request.
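
    For reference, a session-per-request helper of the kind described usually looks something like the sketch below (hypothetical; the actual NHibernateUtil class in the application is not shown in the question and may differ):

        using NHibernate;
        using NHibernate.Cfg;

        // Hypothetical sketch of a session-per-request helper.
        public static class SessionHelper
        {
            // The session factory is expensive to build, so it is created once
            // and lives for the lifetime of the worker process.
            private static readonly ISessionFactory Factory =
                new Configuration().Configure().BuildSessionFactory();

            // Opened at the start of a request and disposed at the end of it.
            public static ISession OpenSession()
            {
                return Factory.OpenSession();
            }
        }

    Because the factory (and the connection pool underneath it) lives as long as the worker process, recycling IIS discards the pooled connections along with it, which may be why an IIS reset clears the errors.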

    Read the article

  • TransactionScope and Transactions

    - by Mike
    In my C# code I am using TransactionScope because I was told not to rely on my SQL programmers always using transactions, and we are responsible, and yada yada. Having said that, it looks like the TransactionScope object rolls back before the SqlTransaction? Is that possible, and if so, what is the correct methodology for wrapping a TransactionScope around a transaction? Here is the SQL test:

        CREATE PROC ThrowError
        AS
        BEGIN TRANSACTION --SqlTransaction
        SELECT 1/0
        IF @@ERROR <> 0
        BEGIN
            ROLLBACK TRANSACTION --SqlTransaction
            RETURN -1
        END
        ELSE
        BEGIN
            COMMIT TRANSACTION --SqlTransaction
            RETURN 0
        END
        go

        DECLARE @RESULT INT
        EXEC @RESULT = ThrowError
        SELECT @RESULT

    If I run this I get just the divide by zero and a return value of -1. Calling it from the C# code I get an extra error message:

        Divide by zero error encountered.
        Transaction count after EXECUTE indicates that a COMMIT or ROLLBACK TRANSACTION statement is missing. Previous count = 1, current count = 0.

    If I give the SQL transaction a name, then:

        Cannot roll back SqlTransaction. No transaction or savepoint of that name was found.
        Transaction count after EXECUTE indicates that a COMMIT or ROLLBACK TRANSACTION statement is missing. Previous count = 1, current count = 2.

    Sometimes it seems the count goes up until the app completely exits. The C# is just:

        using (TransactionScope scope = new TransactionScope())
        {
            // ... execute SQL
            scope.Complete();
        }
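
    For reference, a minimal sketch of calling a procedure like ThrowError inside a TransactionScope (the connection string and error handling are placeholders, not the poster's actual code):

        using System;
        using System.Data;
        using System.Data.SqlClient;
        using System.Transactions;

        class TransactionScopeDemo
        {
            static void Main()
            {
                const string connectionString = "Server=.;Database=TestDb;Integrated Security=true"; // placeholder

                using (var scope = new TransactionScope())
                using (var connection = new SqlConnection(connectionString))
                {
                    connection.Open(); // enlists the connection in the ambient transaction

                    using (var command = new SqlCommand("ThrowError", connection))
                    {
                        command.CommandType = CommandType.StoredProcedure;
                        SqlParameter returnValue = command.Parameters.Add("@ReturnValue", SqlDbType.Int);
                        returnValue.Direction = ParameterDirection.ReturnValue;

                        // Any error raised by the procedure surfaces here as a SqlException,
                        // in which case Complete() below is never reached.
                        command.ExecuteNonQuery();
                        Console.WriteLine("Return value: " + returnValue.Value);
                    }

                    // Disposing the scope without calling Complete() rolls the ambient transaction back.
                    scope.Complete();
                }
            }
        }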

    Read the article

  • How to configure Windows user accounts for ODBC network with NT authentication?

    - by Ian Mackinnon
    I'm trying to create a connection to an SQL Server database from the ODBC Data Source Administrator using "Windows NT authentication using the network login ID". Both server and client are running Windows XP. It appears that any account with administrator privileges can add the data source on the server*, though connection attempts from the client result in error messages that suggest it is trying to authenticate using a guest account. I found a Microsoft support page that says: "For SQL Server...: connect using the impersonated user account." But it doesn't offer advice about how to do that. How do I impersonate a user account on the server? Or (since it sounds like that would lead to an unfortunate squashing of privileges and loss of accountability): how do I give an account on the client privileges on the server database, and then ensure the client attempts authentication with the privileged account and not with a guest account? I'm aware that I'm providing rather sparse information. This is because I'm in unfamiliar territory and don't know what's pertinent. I'll attempt to add any requested information as quickly as possible. *I'm planning on tightening privileges straight after I get it working as it stands.

    Read the article

  • Perfectly reproducible SELECT statement default ordering issue....

    - by Dave
    Hi, I've recently been chasing an issue with a client's db... solution found, but impossible to recreate. Essentially, we're doing a

        Select * from mytable where ArbitraryColumn = 75

    where MyTable has an identity column, called 'MyIdentityColumn', incremented by one on each insert. Naturally, and normally, I would assume that the order returned would be the order in which the rows were inserted (a bad assumption, but one which was forced onto me through an inherited application - which has been patched). Essentially, I would like suggestions as to why the database behaves differently when restored to my local machine (same OS, same SQL Server version - 200 sp3, same collation, and the same backup instance restored on it as the test DB on the client site). When I perform the above select locally, I get the rows in order of insert (i.e. identity column ordered ascending). On the client, the order seems random (but the same 'random' order each time)... A few other points:

        - I have the same collation on my test server as the client
        - Same DB backup restored to a test DB only I can access
        - Same SQL Server version and service pack
        - Same OS
        - The test DB is a new DB - new log and MDF

    I have the problem 'solved' by adding an explicit ORDER BY clause, but I want to understand the cause of the issue, given that my exact attempts to recreate it have been futile while it remains perfectly recreatable on the client server... Thanks in advance, Dave

    Read the article

  • Yii problem in blog tutorial

    - by Kani
    When I log in, the following problem occurs:

        PHP Error

        Description
        include(User.php) [function.include]: failed to open stream: No such file or directory

        Source File
        D:\Badrakh\xampp\htdocs\yii\framework\YiiBase.php(395)

        00383:  * @return boolean whether the class has been loaded successfully
        00384:  */
        00385: public static function autoload($className)
        00386: {
        00387:     // use include so that the error PHP file may appear
        00388:     if(isset(self::$_coreClasses[$className]))
        00389:         include(YII_PATH.self::$_coreClasses[$className]);
        00390:     else if(isset(self::$classMap[$className]))
        00391:         include(self::$classMap[$className]);
        00392:     else
        00393:     {
        00394:         if(strpos($className,'\\')===false)
        00395:             include($className.'.php');
        00396:         else // class name with namespace in PHP 5.3

    Read the article

  • Compressing and copying large files on Windows Server?

    - by Aaron
    I've been having a hard time copying large database backups from the database server to a test box at another site. I'm open to any ideas that would help me get this database moved without having to resort to a USB hard drive and the mail. The database server is running Windows Server 2003 R2 Enterprise with 16 GB of RAM and two quad-core 3.0 GHz Xeon X5450s. Files are SQL Server 2005 backup files between 100 GB and 250 GB. The pipe is not the fastest, and SQL Server backup files typically compress down to 10-40% of the original, so it made sense to me to compress the files first. I've tried a number of methods, including:

        - gzip 1.2.4 (UnxUtils) and 1.3.12 (GnuWin)
        - bzip2 1.0.1 (UnxUtils) and 1.0.5 (Cygwin)
        - WinRAR 3.90
        - 7-Zip 4.65 (7za.exe)

    I've attempted to use the WinRAR and 7-Zip options for splitting into multiple segments. 7za.exe has worked well for me for database backups on another server, which has ~50 GB backups. I've also tried splitting the .BAK file first with various utilities and compressing the resulting segments. No joy with that approach either - no matter the tool I've tried, it ends up butting against the size of the file. Especially frustrating is that I've transferred files of similar size on Unix boxes without problems using rsync+ssh. Installing an SSH server is not an option for the situation I'm in, unfortunately. For example, this is how 7-Zip dies:

        H:\dbatmp>7za.exe a -t7z -v250m -mx3 h:\dbatmp\zip\db-20100419_1228.7z h:\dbatmp\db-20100419_1228.bak

        7-Zip (A) 4.65  Copyright (c) 1999-2009 Igor Pavlov  2009-02-03
        Scanning

        Creating archive h:\dbatmp\zip\db-20100419_1228.7z

        Compressing  db-20100419_1228.bak

        System error:
        Unspecified error

    Read the article

  • Converting ntext to nvarchar(max) - Getting around the size limitation

    - by Overflew
    Hi all, I'm trying to change an existing SQL ntext column to nvarchar(max), and I'm encountering an error on the size limit. There's a large amount of existing data, some of which is more than the 8k limit, I believe. We're looking to convert this so that the field is searchable in LINQ. The two SQL statements I've tried are:

        update Table set dataNVarChar = convert(nvarchar(max), dataNtext) where dataNtext is not null
        update Table set dataNVarChar = cast(dataNtext as nvarchar(max)) where dataNtext is not null

    And the error I get is:

        Cannot create a row of size 8086 which is greater than the allowable maximum row size of 8060.

    This is using SQL Server 2008. Any help appreciated, thanks.

    Update / Solution: The marked answer below is correct; SQL 2008 can change the column to the correct data type in my situation, and there are no dramas with the LINQ-utilising application we use on top of it:

        alter table [TBL] alter column [COL] nvarchar(max)

    I've also been advised to follow it up with:

        update [TBL] set [COL] = [COL]

    which completes the conversion by moving the data from the LOB structure into the table (if the length is less than 8k), which improves performance / keeps things proper.

    Read the article

  • Show WordPress posts outside of the blog

    - by way2project
    I have two WordPress blogs on my site, and now I want to show the posts from categories of both blogs on the home page. I am using the following code:

        <?php
        require($_SERVER['DOCUMENT_ROOT'] . '/projects/wp-load.php');
        query_posts('cat=9& showposts=8');
        if (have_posts()) : while (have_posts()) : the_post(); ?>
            <ul><li>
                <a href="<?php the_permalink(); ?>" title="<?php the_title(); ?>"><?php
                    $shorttitle4 = substr(the_title('','',FALSE),0,25)."...";
                    echo $shorttitle4; ?></a>
            </li></ul>
        <?php endwhile; else: echo "no posts"; endif; ?>
        <?php wp_reset_query(); ?>

    One of my blogs is placed in the "projects" folder and the other is placed in the "technology" folder, but this code shows posts only from the projects blog, even if I change the folder to "technology" in the code above. I think this is because of the wp-load.php file. Can you help me? Thanks

    Read the article

  • MySQL ADO.NET Connector & MSSQL Integration Services

    - by user1114330
    Here I am, day three... attempting to sync a data view on a Windows Vista box (64 bit) running MSSQL 2012 and Visual Studio 2010. Sanity is slipping and hunger for progress fills my attention. I went through hell trying to get the MySQL ODBC drivers to do the job, but to no avail... everyone seems to be lost, and all the threads I can find are solutions that do not work for me. The problem: System DSNs not being seen by SSIS ("SSIS DSN Not Showing as ODBC Data Source"). I made the decision to try out the ADO.NET connector... and to my surprise it is actually in the selection list of data sources in SSIS. So I take off running to create a Data Flow Task and create an ADO.NET Source (a local MSSQL DB)... all is good as usual. Then I move swiftly to creating an ADO.NET Destination, enter my credentials... wow, I am finally selecting a database on my Linux server! Happy, thinking that I have finally figured out a way to get the job done. Then I move to mappings... nope, something is wrong... I am getting an error that hurts my eyes:

        Pipeline component has returned HRESULT error code 0xC0208457 from a method call.

        Error at Data Flow Task [ADO NET Destination [81]]: Failed to get properties of external columns. The table name you entered may not exist or you do not have SELECT permission on the table object and an alternative attempt to get column properties through connection has failed. Detailed error messages are: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '"database".tablename' at line 1.

        The descriptor files on path C:\Program Files (x86)\Microsoft SQL Server\110\DTS\ProviderDescriptors\ does not contain schema information for connection of type MySQL.Data.MySqlClient.MySqlConnection.

    So it looks like it can't get the information and therefore I cannot map the tables properly. Any ideas on this would be ultra helpful... thanks in advance to all!

    Read the article

  • Missing drive space in Server 2003

    - by Tim Brigham
    I have two drives used for SQL backups which for the last week have been acting strange - the free space indicated by Windows is far off from what WinDirStat, etc. indicate. There should only be about 60 GB of drive space used, yet about 160 GB is in use. This would match the utilization if the two last backup files were still residing on disk. SQL Server is 2000, the OS is Server 2003 x64, running on a VMware 5.0 cluster. OSSEC and McAfee show this system as clean. My current plan is to temporarily attach one of these drives to another VM for analysis. Is there anything more I should be looking at? There were a lot of pages on the net when I was looking for documentation on this issue, but I haven't found this case described. EDIT: Unfortunately even a full reboot did not clear this behavior. I also used Process Explorer to look for open file handles. No dice.

    Read the article

  • Full-text search locks up database - error 0x8001010e

    - by Stewart May
    Hi, we have a full-text catalog that is populated via a job every 15 minutes like so:

        ALTER FULLTEXT INDEX ON [dbo].[WorkItemLongTexts] START INCREMENTAL POPULATION

    We have encountered a problem where the database containing this catalog locks up. There are a couple of scenarios: we either see the job execute and the process hang with a wait type of UNKNOWN TOKEN, or we see another process hang with a wait type of MSSEARCH. Once this happens, the job continues to run but informs us that the request to start a full-text index population is ignored because a population is currently active. Looking in the full-text log files we see the following error each time these problems occur:

        2010-04-21 08:15:00.76 spid21s The full-text catalog health monitor reported a failure for full-text catalog "XXXFullTextCatalog" (5) in database "YYY" (14). Reason code: 0. Error: 0x8001010e(The application called an interface that was marshalled for a different thread.). The system will restart any in-progress population from the previous checkpoint. If this message occurs frequently, consult SQL Server Books Online for troubleshooting assistance. This is an informational message only. No user action is required.

    The only solution is to restart the SQL Server service and then the full-text service. This is now occurring on a daily basis, so any help would be appreciated.

    Read the article

  • How bad is opening and closing a SQL connection several times? What is the exact effect?

    - by Eren
    For example, I need to fill lots of DataTables with SqlDataAdapter's Fill() method:

        DataAdapter1.Fill(DataTable1);
        DataAdapter2.Fill(DataTable2);
        DataAdapter3.Fill(DataTable3);
        DataAdapter4.Fill(DataTable4);
        DataAdapter5.Fill(DataTable5);
        ...

    Even though all the data adapter objects use the same SqlConnection, each Fill method will open and close the connection unless the connection state is already open before the method call. What I want to know is how unnecessarily opening and closing SqlConnections affects the performance of the application. How much does the application need to scale before the bad effects of this show up (hundreds of thousands of concurrent users?)? On a mid-size website (50,000 daily users), is it worth bothering to track down all the Fill() calls, keep them together in the code, and open the connection before any Fill() call and close it afterwards?
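
    For illustration, the "open once, fill many" variant mentioned at the end would look roughly like this (names are made up; it assumes every adapter shares the same SqlConnection):

        using System.Data;
        using System.Data.SqlClient;

        static class BatchFill
        {
            public static void FillAll(SqlConnection connection, SqlDataAdapter[] adapters, DataTable[] tables)
            {
                bool openedHere = connection.State != ConnectionState.Open;
                if (openedHere)
                    connection.Open(); // each Fill() now reuses this already-open connection

                try
                {
                    for (int i = 0; i < adapters.Length; i++)
                        adapters[i].Fill(tables[i]);
                }
                finally
                {
                    if (openedHere)
                        connection.Close(); // close only if this method opened the connection
                }
            }
        }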

    Read the article

  • Entity SQL Group By problem, please help

    - by Zviadi
    Hello, help me please with this simple E-SQL query:

        var qStr = "SELECT SqlServer.Month(o.DatePaid) as month, SqlServer.Sum(o.PaidMoney) as PaidMoney FROM XACCModel.OrdersIncomes as o group by SqlServer.Month(o.DatePaid)";

    Here's what I have: a simple entity called OrdersIncomes with ID, PaidMoney, DatePaid, and Order_ID properties. I want to select the month and the summed PaidMoney, like this:

        month   PaidMoney
        1       500
        2       700
        3       1200

    The T-SQL looks like this and works fine:

        select MONTH(o.DatePaid), SUM(o.PaidMoney)
        from OrdersIncomes as o
        group by MONTH(o.DatePaid)

    results:

        3   31.0000
        4   127.0000
        5   20.0000

        (3 row(s) affected)

    But the E-SQL does not work and I don't know what to do. Here is my E-SQL, which needs refactoring:

        var qStr = "SELECT SqlServer.Month(o.DatePaid) as month, SqlServer.Sum(o.PaidMoney) as PaidMoney FROM XACCModel.OrdersIncomes as o group by SqlServer.Month(o.DatePaid)";

    The exception is:

        ErrorDescription = "The identifier 'o' is not valid because it is not contained either in an aggregate function or in the GROUP BY clause."

    If I include o in the group by clause, like:

        FROM XACCModel.OrdersIncomes as o group by o

    then I don't get the summed and aggregated results. Is this some bug, or what am I doing wrong? Here is the LINQ to Entities query, and it works too:

        var incomeResult = from ic in _context.OrdersIncomes
                           group ic by ic.DatePaid.Month into gr
                           select new { Month = gr.Key, PaidMoney = gr.Sum(i => i.PaidMoney) };

    Read the article

  • Problem running “Central Administration” website after Windows update on Windows 2003 Server Standard

    - by Magdy Roshdy
    I had WSS 2.0 and then upgraded to WSS 3.0; the old installation database was SQL 2000, and now I have another SQL Server instance called server_name\MICROSOFT##SSEE. After the upgrade everything worked fine; our team started to use the portal, and we put a lot of documents and activity on it. The problem started after installing Windows updates: the website suddenly stopped working, giving me the error "Cannot connect to the configuration database". If I try to open the SharePoint Products and Technologies Configuration Wizard it gives me a strange error:

        An exception of type Microsoft.SharePoint.PostSetupConfiguration.PostSetupConfigurationTaskException was thrown. Additional exception information: SharePoint Products and Technologies cannot be configured. The current installation mode does not support SKU to SKU upgrades because there exists an older version of Windows SharePoint Services that must be upgraded first.

    At this post: http://stackoverflow.com/questions/114398/iis-error-cannot-connect-to-the-configuration-database/249494#249494 the guy in the second answer has the same problem and suggests a solution, but I don't understand it well. I tried, as he suggested, to set the identity of the app pool of the SharePoint web site to "IWAM_server_name"; after that the error changed as he said, and the web site now gives me "Server Application Unavailable". When I checked the Event Viewer on the server I found that ASP.NET 2.0 reports this exception:

        Could not load file or assembly 'System.Web, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a' or one of its dependencies. Access is denied.

    I don't know how to solve this problem. I really want to get my web site working because our team really needs these documents and their stuff. I hope I will find someone to help me.

    Read the article

  • Does a PostgreSQL dump create sequences that start with - or after - the last key?

    - by bennylope
    I recently created a SQL dump of a database behind a Django project, and after cleaning the SQL up a little bit was able to restore the DB and all of the data. The problem was that the sequences were all mucked up. I tried adding a new user and generated the Python error IntegrityError: duplicate key violates unique constraint. Naturally I figured my SQL dump didn't restart the sequence. But it did:

        DROP SEQUENCE "auth_user_id_seq" CASCADE;
        CREATE SEQUENCE "auth_user_id_seq" INCREMENT 1 START 446 MAXVALUE 9223372036854775807 MINVALUE 1 CACHE 1;
        ALTER TABLE "auth_user_id_seq" OWNER TO "db_user";

    I figured out that a repeated attempt at creating a user (or any new row in any table with existing data and such a sequence) allowed for successful object/row creation. That solved the pressing problem. But given that the last user ID in that table was 446 - the same start value in the sequence creation above - it looks like PostgreSQL was simply trying to start creating rows with that key. Does the SQL dump set the start key off by one? Or should I invoke some other command to start sequences after the given start ID? Keenly curious.

    Read the article

  • Help with interesting VS2010 and SQL2008 bug

    - by user355770
    Hey, so I'm using Visual Studio 2010 to create a webpage, and I'm calling some tables from SQL Server 2008. Here is where I'm confused... The code runs fine, no errors. The page works, except I'm missing the rows in my 3rd column from the table. Everything else shows up. I've checked to make sure the names match everywhere and that in SQL the joins and such work. It's just very weird that I'd be missing my 2 rows from the 3rd column. Anyone have any ideas to help? The error is in the tab called Research Material.

        else if (tabTagId == "tpArlington_ProjectInformation")
        {
            repArlington_ProjectInformation.DataSource = ds;
            repArlington_ProjectInformation.DataBind();
        }
        else if (tabTagId == "tpArlington_Plan")
        {
            repArlington_Plan.DataSource = ds;
            repArlington_Plan.DataBind();
        }
        else if (tabTagId == "tpArlington_ResearchMaterial")
        {
            repArlington_ResearchMaterial.DataSource = ds;
            repArlington_ResearchMaterial.DataBind();
        }
        else if (Session["projectAbbreviation"].ToString() == "ARLING")
        {
            tpArlington_ProjectInformation.HeaderText = "Project Information";
            tpArlington_ProjectInformation.Visible = true;
            tpArlington_Plan.HeaderText = "Plan";
            tpArlington_Plan.Visible = true;
            tpArlington_ResearchMaterial.HeaderText = "ResearchMaterial";
            tpArlington_ResearchMaterial.Visible = true;
            getTabData("tpArlington_ProjectInformation");
            getTabData("tpArlington_Plan");
            getTabData("tpArlington_ReasearchMaterial");
        }

    The 2 other tabs work perfectly. The research material tab is where the problem is: the content in the tab doesn't come up. The text on the tab DOES come up, but not the data from SQL. The data in SQL looks good, the IDs match, and everything is joined properly, otherwise the other 2 tabs wouldn't work. That is what is confusing me. Any suggestions or specific info you need, just ask. Thanks!

    Read the article

  • LINQ to Entities - Left Outer Join - SQL 2000

    - by user255234
    Hi! I'm using LINQ to Entities. I have the following query in my code; it includes a left outer join:

        var surgeonList = (from item in context.T1_STM_Surgeon.Include("T1_STM_SurgeonTitle")
                               .Include("OTER").Include("OSLP")
                           join reptable in context.OSLP on item.Rep equals reptable.SlpCode into surgRepresentative
                           where item.ID == surgeonId
                           select new
                           {
                               ID = item.ID,
                               First = item.First,
                               Last = item.Last,
                               Rep = (surgRepresentative.FirstOrDefault() != null) ? surgRepresentative.FirstOrDefault().SlpName : "N/A",
                               Reg = item.OTER.descript,
                               PrimClinic = item.T1_STM_ClinicalCenter.Name,
                               Titles = item.T1_STM_SurgeonTitle,
                               Phone = item.Phone,
                               Email = item.Email,
                               Address1 = item.Address1,
                               Address2 = item.Address2,
                               City = item.City,
                               State = item.State,
                               Zip = item.Zip,
                               Comments = item.Comments,
                               Active = item.Active,
                               DateEntered = item.DateEntered
                           }).ToList();

    My DEV server has SQL 2008, so the code works just fine. When I moved this code to the client's production server - they use SQL 2000 - I started getting "Incorrect syntax near '('". I tried changing the ProviderManifestToken to 2000 in my .edmx file; then I started getting "The execution of this query requires the APPLY operator, which is not supported in versions of SQL Server earlier than SQL Server 2005." I tried changing the token to 2005, and "Incorrect syntax near '('" is back. Can anybody help me find a workaround for this? Thank you very much in advance!

    Read the article

  • Ant build script executing <sql> task using Java code

    - by Jay
    Any idea why none of the debug statements are printed after executing the Ant build script's <sql> task via Java code? The Java class that executes the SQL in the build script is:

        public class AntRunnerTest {
            private Project project;

            public void executeTask(String taskName) {
                try {
                    project = new Project();
                    project.init();
                    project.setBasedir(new String("."));
                    ProjectHelper helper = ProjectHelper.getProjectHelper();
                    project.addReference("ant.projectHelper", helper);
                    helper.parse(project, new File("build-copy.xml"));
                    System.out.println("Before");
                    project.executeTarget(taskName);
                    System.out.println("After");
                } catch(Exception ex) {
                    System.out.println(ex.getMessage());
                }
            }

            public static void main(String args[]) {
                try {
                    AntRunnerTest newInst = new AntRunnerTest();
                    newInst.executeTask("sql");
                } catch(Exception e) {
                    System.out.println(""+e);
                }
            }
        }

    I don't see the debug string "After" getting printed to the console. I noticed this issue only when I try to execute a sql task using Java code. The Ant script has the following simple transaction tag in it:

        <transaction>
            <![CDATA[
                select now()
            ]]>
        </transaction>

    Any thoughts? Thanks in advance.

    Read the article

  • PDO closeCursor Error

    - by Metropolis
    Hey everyone, I currently have a database layer that I wrote myself, and I have been using it for over a year without any problems. The database class uses PDO, and there are two different databases that I regularly connect to (MySQL and MS SQL). The MS SQL database is used for Accpac accounting storage, and the MySQL database is used for everything else. In one of the MySQL databases I have all of the DSNs listed, which I use to create the string I need to connect to the MS SQL databases. I have a new program I am trying to write in which I take employee data from one of the MySQL databases and use the employee ID to get that employee's information from the MS SQL database. For some reason, whenever I run the program it gets through about 1200 records (out of 11k) and then crashes with an error like the following:

        Fatal error: Call to a member function closeCursor() on a non-object

    I have tried moving the loops around in many different ways, and I have tried manually closing the connections by setting the database handle to null. Nothing I do seems to work. Thanks for any help! Metropolis

    Read the article
