Search Results

Search found 5233 results on 210 pages for 'a records'.


  • sqlite COUNT in flex returning [object Object]

    - by Adam
    I'm sure this is an easy question and I'm just doing something stupid, but I'm really new to all this code. I'm trying to run a SQLite query in Flex to count the total number of records. I believe it's working fine, but I just can't figure out how to display the result - all I get back is [object Object].

        private function overviewOne():void {
            var stmt:SQLStatement = new SQLStatement();
            stmt.sqlConnection = sqlConn;
            stmt.text = "SELECT COUNT(user_id) FROM tbl_user WHERE status_status = 'Away'";
            stmt.execute();
            var result:SQLResult = stmt.getResult();
            acoverviewOne = new Array(result.data);
            trace(result.data[0]);
        }
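
    The [object Object] output usually means the whole row object is being traced rather than a named column. A minimal sketch of the same idea in Python's sqlite3 (table and column names taken from the question): alias the aggregate so the result row exposes a predictable name, then read that column.

        import sqlite3

        conn = sqlite3.connect("app.db")              # hypothetical database file
        conn.row_factory = sqlite3.Row                # rows behave like dictionaries
        cur = conn.execute(
            "SELECT COUNT(user_id) AS away_count "
            "FROM tbl_user WHERE status_status = 'Away'"
        )
        row = cur.fetchone()
        print(row["away_count"])                      # read the aliased column, not the row itself

    In the Flex code the equivalent idea would be aliasing the COUNT and reading that property off result.data[0] instead of tracing the row object itself.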

    Read the article

  • Use parameters with CTL

    - by Hal
    Hi. I am using a CTL file to load data stored in a file into a specific table in my Oracle database. Currently, I launch the loader using the following command line:

        sqlldr user/pwd@db data=my_data_file control=my_loader.ctl

    I would like to know if it is possible to specify parameters to be retrieved in the CTL file. Also, is it possible to retrieve the name of the data file used by the CTL to fill the table? I would also like to insert it for each row. I currently have to call a procedure to update previously inserted records. Any help would be appreciated!
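
    SQL*Loader control files do not take arbitrary substitution parameters, so a common workaround is to generate the control file from a template just before calling sqlldr, filling in the data file name both for the load and as a CONSTANT column so it lands in every row. A minimal Python sketch of that approach; the target table and column names here are hypothetical:

        import subprocess

        data_file = "my_data_file"                 # from the question's command line
        ctl_lines = [
            "LOAD DATA",
            f"INFILE '{data_file}'",
            "APPEND INTO TABLE my_table",          # hypothetical table name
            "FIELDS TERMINATED BY ','",
            # record the source file in each row via a CONSTANT column
            f"(col_a, col_b, source_file CONSTANT '{data_file}')",
        ]
        with open("my_loader.ctl", "w") as f:
            f.write("\n".join(ctl_lines) + "\n")

        subprocess.run(
            ["sqlldr", "user/pwd@db", f"data={data_file}", "control=my_loader.ctl"],
            check=True,
        )

    This removes the need for the follow-up procedure, since the file name is inserted with each row at load time.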

    Read the article

  • Scheduled cron job to check for pending activity

    - by luckytaxi
    Using PHP ... This is for my personal use, so I'm thinking maybe 3-4 emails a day. I'm at the point where I can send an email to a dedicated email address, and my script parses the message and stores it in a DB. Now I need to figure out the best way to check the records in the DB for any upcoming task. I feel like I'm missing something, maybe a trigger field for when a reminder should go out. However, that's not a concern at the moment, since I'll just send an alert 15 minutes prior to the due date. The question is, should I run a cron job that queries the DB every minute? I take it the query will have to say something like "select all tasks that are due within 15 minutes."
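
    A once-a-minute cron job running exactly that query is the usual approach; a "reminder_sent" flag keeps the same task from being mailed twice. A minimal sketch of the polling script, in Python with the standard sqlite3 module rather than PHP/MySQL (the tasks table and its columns are hypothetical, with due_at stored as ISO text):

        import sqlite3
        from datetime import datetime, timedelta

        conn = sqlite3.connect("tasks.db")
        now = datetime.now()
        window_end = now + timedelta(minutes=15)

        # tasks due within the next 15 minutes that have not been alerted yet
        due = conn.execute(
            "SELECT id, subject, due_at FROM tasks "
            "WHERE due_at BETWEEN ? AND ? AND reminder_sent = 0",
            (now.isoformat(sep=" "), window_end.isoformat(sep=" ")),
        ).fetchall()

        for task_id, subject, due_at in due:
            print(f"send reminder for {subject} (due {due_at})")   # stand-in for the email step
            conn.execute("UPDATE tasks SET reminder_sent = 1 WHERE id = ?", (task_id,))
        conn.commit()

    Scheduled from cron once a minute, at 3-4 reminders a day the query cost is negligible.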

    Read the article

  • Am I underestimating MySQL?

    - by user281434
    Hi. I'm about to implement a feature on my website that recommends content to users based on the content they already have in their library (a la Last.fm). A single table holds all the records of the content they have added, so a row might look something like:

        --------------------
        | userid | content |
        --------------------
        |   28   |    a    |
        --------------------

    When I want to recommend some content for a user, I use a query to get all the user ids that have content 'a' in their library. Then, out of those user ids, I run another query that finds the next most common content among those users (e.g. 'b') and show that to the user. My problem is when I think about the big picture here. Say that eventually my site holds something like 500,000 rows in the table - will this make the MySQL response very slow, or am I underestimating MySQL here?
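
    The two steps can usually be collapsed into one grouped query, and with an index on (content, userid) that stays fast well beyond 500,000 rows. A minimal sketch using Python's sqlite3 in place of MySQL; the library table and columns come from the question:

        import sqlite3

        conn = sqlite3.connect("library.db")
        seed = "a"   # content the current user already has

        recommendations = conn.execute(
            "SELECT content, COUNT(*) AS shared_users "
            "FROM library "
            "WHERE userid IN (SELECT userid FROM library WHERE content = ?) "
            "  AND content <> ? "
            "GROUP BY content "
            "ORDER BY shared_users DESC "
            "LIMIT 5",
            (seed, seed),
        ).fetchall()

        for content, shared_users in recommendations:
            print(content, shared_users)

    At this scale the main risk is not the row count but a missing composite index, which would turn the IN subquery into a full table scan.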

    Read the article

  • Check if a lookup yields any valid rows for insertion before clearing the table using SSIS

    - by Chris
    SSIS ignoramus needing help! The situation: a temp table, tableA, is populated from an Excel file that is owned by a different group and has been known to change format at random times. A lookup needs to be performed on the temp table, tableA, to populate tableB with valid data. If the lookup returns 0 rows, an email should be sent and the existing data in tableB should remain untouched. If the lookup returns a number of valid rows greater than 0, tableB should have all of its rows deleted and the new records from the lookup on tableA inserted. Question: what would be the best way to check whether there are any valid rows and perform the appropriate action(s) depending on the result? Thanks!
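
    Inside SSIS this is usually a Row Count transformation writing into a package variable, with precedence constraint expressions choosing between the "send email" task and the "delete and load" path. Outside the designer, the same stage-count-then-swap guard can be sketched in plain SQL driven from Python; this is only a sketch of the control flow with hypothetical table names, not the SSIS answer itself:

        import sqlite3

        conn = sqlite3.connect("warehouse.db")

        # stage the lookup result first, without touching tableB
        conn.execute("DROP TABLE IF EXISTS staging")
        conn.execute(
            "CREATE TABLE staging AS "
            "SELECT a.* FROM tableA a JOIN valid_codes v ON v.code = a.code"
        )
        valid_rows = conn.execute("SELECT COUNT(*) FROM staging").fetchone()[0]

        if valid_rows == 0:
            print("lookup produced no rows - send the email, leave tableB untouched")
        else:
            conn.execute("DELETE FROM tableB")
            conn.execute("INSERT INTO tableB SELECT * FROM staging")
            conn.commit()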

    Read the article

  • Unique number identifier generation

    - by xwrs
    I have to create logic for generating a unique number identifier for records in a database. The id generated by the database is a separate column. At the moment, when the user calls the "create record" action, I save a new record, get its database id, generate the record number from this id, and then put it on the edit form. Working this way means that all entity fields have to be nullable just so the record can be saved to the database. I don't like this approach; I know there should be a better way. Is there a better practice for generating a unique number identifier? What is the probability of generating non-unique random numbers? Thank you.
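
    One way to avoid saving a half-empty row just to obtain an id is to generate the identifier before the insert, either as a UUID or from a small counter table bumped inside the same transaction as the insert. A minimal sketch of both options in Python with sqlite3; the counters and records tables are hypothetical:

        import sqlite3
        import uuid

        # Option 1: a random UUID needs no database round-trip; with 122 random bits
        # the chance of a collision is negligible for any realistic record count.
        record_number = uuid.uuid4().hex

        # Option 2: a dedicated counter table, incremented in the same transaction
        # as the insert, gives short, human-readable numbers.
        conn = sqlite3.connect("app.db")
        with conn:
            conn.execute("UPDATE counters SET value = value + 1 WHERE name = 'record_number'")
            next_value = conn.execute(
                "SELECT value FROM counters WHERE name = 'record_number'"
            ).fetchone()[0]
            conn.execute(
                "INSERT INTO records (record_number, title) VALUES (?, ?)",
                (f"REC-{next_value:06d}", "new record"),
            )

    Short random integers, by contrast, collide far sooner than intuition suggests (the birthday bound), which is why a sequence or counter is the safer choice for short numbers.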

    Read the article

  • Problem with nhibernate join

    - by MexicanHacker
    I'm trying to do a join like this using Fluent NHibernate:

        Id(x => x.Id);
        Map(x => x.SourceSystemRecordId, "sourceSystemRecord_id");

    and then:

        Join("cat.tbl_SourceSystemRecords", SourceSystemRecords);

    But it seems I don't have a way to specify the column I want to join on from the first table; in this case I need to join on SourceSystemRecordId and not on Id. Is there any way I can specify this? I tried References(), but that requires me to create an object for this relationship. What I need is to aggregate the columns of the source system records onto the ones in the main table.

    Read the article

  • Insert into select and update in single query

    - by Ossi
    I have 4 tables: tempTBL, linksTBL, categoryTBL and extraTBL. On my tempTBL I have the columns: ID, name, url, cat, isinserted. On my linksTBL I have: ID, name, alias. On my categoryTBL I have: cl_id, link_id, cat_id. On my extraTBL I have: id, link_id, value. How do I write a single query to select from tempTBL all items where isinserted = 0, insert them into linksTBL, and for each record inserted pick up the ID (which is the primary key) and insert that ID into categoryTBL with cat_id = 88? After that, insert into extraTBL the ID for link_id and the url for value. I know this is confusing, but I'll post it anyhow... This is what I have so far:

        INSERT IGNORE INTO linksTBL (link_id, link_name, alias)
        VALUES (NULL, 'tex2', 'hello');                 # generate ID by inserting NULL

        INSERT INTO categoryTBL (link_id, cat_id)
        VALUES (LAST_INSERT_ID(), '88');                # use ID in second table

    I would like to add, somewhere, that it only selects items where isinserted = 0 and inserts those records, and once inserted, changes isinserted to 1, so that the next time it runs it will not add them again.
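
    Because each inserted link needs its own LAST_INSERT_ID for the two child tables, this is hard to express as one INSERT ... SELECT; a small loop over the pending rows is the usual fallback. A sketch in Python with sqlite3 standing in for MySQL (table and column names follow the question's description; cursor.lastrowid plays the LAST_INSERT_ID() role):

        import sqlite3

        conn = sqlite3.connect("site.db")
        cur = conn.cursor()

        pending = cur.execute(
            "SELECT ID, name, url FROM tempTBL WHERE isinserted = 0"
        ).fetchall()

        for temp_id, name, url in pending:
            cur.execute("INSERT INTO linksTBL (name, alias) VALUES (?, ?)", (name, name))
            link_id = cur.lastrowid                               # new primary key
            cur.execute("INSERT INTO categoryTBL (link_id, cat_id) VALUES (?, 88)", (link_id,))
            cur.execute("INSERT INTO extraTBL (link_id, value) VALUES (?, ?)", (link_id, url))
            cur.execute("UPDATE tempTBL SET isinserted = 1 WHERE ID = ?", (temp_id,))

        conn.commit()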

    Read the article

  • What can cause reset connection when running PHP script?

    - by marcin_koss
    I've developed a web application with CodeIgniter that works perfectly on my local machines (one with Windows and one with Linux). When I moved it to my hosting server, the connection gets reset when running one particular PHP script that does a few MySQL queries and some operations on arrays. The data I'm querying is small, just a few tables with up to 25 records. Firefox returns "The connection was reset" after maybe 2-3 seconds. I checked the server's error logs but there was nothing there. Unfortunately I don't have access to the Apache error logs. What can cause this behavior?

    Read the article

  • How to perform DNS query on iOS

    - by yasirmturk
    I want to perform some DNS queries, e.g. to get the IP records for a specific domain name. I am looking for the preferred way, or a useful snippet, to do this on the iOS 3.2+ SDK. Thanks in advance. Apart from other snippets, I found this code:

        Boolean result;
        CFHostRef hostRef;
        NSArray *addresses;
        NSString *hostname = @"apple.com";

        hostRef = CFHostCreateWithName(kCFAllocatorDefault, (CFStringRef)hostname);
        if (hostRef) {
            // pass an error instead of NULL here to find out why it failed
            result = CFHostStartInfoResolution(hostRef, kCFHostAddresses, NULL);
            if (result == TRUE) {
                addresses = (NSArray *)CFHostGetAddressing(hostRef, &result);
            }
        }
        if (result == TRUE) {
            [addresses enumerateObjectsUsingBlock:^(id obj, NSUInteger idx, BOOL *stop) {
                NSString *strDNS = [NSString stringWithUTF8String:inet_ntoa(*((struct in_addr *)obj))];
                NSLog(@"Resolved %d->%@", idx, strDNS);
            }];
        } else {
            NSLog(@"Not resolved");
        }

    But this produces the same IP for every host: "Resolved 0->220.120.64.1". Any help?

    Read the article

  • Binding Data to Word 2007 Content Controls Using Visual Studio Tools for the Office System (3.0)

    - by Simon Lomax
    Hi. I found this article (http://msdn.microsoft.com/en-us/library/bb967663.aspx) and thought: great, that's exactly what I'm trying to do. I want to programmatically build a product brochure using content controls and OpenXML. The article in question refers to an accompanying video which unfortunately does not appear to be available, nor does the code. I posted a comment to ask where they are, but in the meantime does anybody know of a good example? There are plenty of examples of binding/merging one record into an OpenXML Word document, but I want to bind a whole list of records to create a product brochure. Can anyone point me to a good tutorial? Thanks.

    Read the article

  • Insert statement with CASE to avoid duplicate record insertion

    - by rama
    I have written the SP below to pre-check for duplicate records before inserting into the table, but it does not allow me to write an INSERT statement inside the CASE. How can I write the stored procedure so that it first checks for the value of @OrderName in the table and then, if it is not present, inserts it into the database?

        CREATE PROCEDURE [Test Procedure ]
        (
            @section varchar(70),
            @mark varchar(70),
            @qty decimal(18,2),
            @Weight decimal(18,2),
            @dateupdateremark int,
            @OrderName varchar(70)
        )
        AS
        BEGIN
            SET NOCOUNT ON;
            select case(@OrderName)
                when (select OrderName from dbo.tbl_insertxmldetails
                      where (@OrderName) not in (select OrderName from tbl_insertxmldetails))
                then
                    insert into dbo.tbl_insertxmldetails
                        (Section, Mark, QTY, Weight, Dateupdateremark, OrderName, SystemDate)
                    values
                        (@Section, @Mark, @QTY, @Weight, @Dateupdateremark, @OrderName, GETDATE())
                else 'File already Exists'
            end
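
    In T-SQL a CASE is an expression and cannot contain an INSERT; the usual shape is IF NOT EXISTS (SELECT 1 FROM dbo.tbl_insertxmldetails WHERE OrderName = @OrderName) followed by the INSERT, with the "File already Exists" message in the ELSE branch. As a neutral sketch of that check-then-insert logic, here it is in Python with sqlite3 rather than as the stored procedure itself:

        import sqlite3

        def insert_order(conn, order_name, section, mark, qty, weight, remark):
            """Insert only when the order name is not already present."""
            exists = conn.execute(
                "SELECT 1 FROM tbl_insertxmldetails WHERE OrderName = ?", (order_name,)
            ).fetchone()
            if exists:
                return "File already exists"
            conn.execute(
                "INSERT INTO tbl_insertxmldetails "
                "(Section, Mark, QTY, Weight, Dateupdateremark, OrderName, SystemDate) "
                "VALUES (?, ?, ?, ?, ?, ?, datetime('now'))",
                (section, mark, qty, weight, remark, order_name),
            )
            conn.commit()
            return "Inserted"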

    Read the article

  • Best practice for avoiding locks on a heavily read table?

    - by Luiggi
    Hi. I have a big database (~4 GB) with 2 large tables (~3M records), seeing ~180K SELECTs/hour, ~2K UPDATEs/hour and ~1K INSERTs+DELETEs/hour. What would be the best practice to guarantee no locks for the reading tasks while inserting/updating/deleting? I was thinking about using a NOLOCK hint, but there is so much discussion about this (it's good, it's bad, it depends) that I'm a bit lost. I must say I've tried it in a dev environment and didn't find any problems, but I don't want to put it in production until I get some feedback... Thank you! Luiggi

    Read the article

  • Insert method of TableAdapter not working?

    - by Stijn Leenknegt
    I'm using ADO.NET in my C# project. In my form I added a BindingSource element from the toolbox in VS2010 and set its connection to a table of my dataset. It created a TableAdapter automatically for me. I want to insert a record, so I call the Insert() method of the TableAdapter. But when I view my database data, it doesn't have any new records...

        orderID = this.orderTableAdapter.Insert("", "", (int)OrderStatus.IN_CONSTRUCTION, DateTime.Now);

    Or do I need to insert it manually with a SqlCommand?

    Read the article

  • Searching the first few characters of every word within a string in C#

    - by user1704669
    I am new to programming languages. I have a requirement where I have to return records based on a search string. For example, I have the following 3 records and my search string is 'Cal':

        1) University of California
        2) Pascal Institute
        3) California University

    If I try String.Contains, all 3 are returned. If I try String.StartsWith, I get only #3, but my requirement is that I need #1 and #3 in the result. Thank you for your help. -Joel
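
    The requirement is a prefix match at the start of any word, which a word-boundary regular expression expresses directly. The original question is C#, but the pattern carries over; a minimal sketch in Python:

        import re

        records = [
            "University of California",
            "Pascal Institute",
            "California University",
        ]
        term = "Cal"
        # \b anchors the match to the start of a word, so the "cal" inside "Pascal" is skipped
        pattern = re.compile(r"\b" + re.escape(term), re.IGNORECASE)

        matches = [r for r in records if pattern.search(r)]
        print(matches)   # ['University of California', 'California University']

    In C# the same idea would be a word-boundary Regex, e.g. Regex.IsMatch(record, @"\b" + Regex.Escape(term), RegexOptions.IgnoreCase).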

    Read the article

  • laravel multiple where clauses within a loop

    - by user1424508
    Pretty much I want the query to select all records of users that are 25 years old AND are either between 150-170cm OR 190-200cm tall. I have the query written down below. However, the problem is it keeps getting 25-year-olds OR people who are 190-200cm, instead of 25-year-olds that are 150-170cm OR 25-year-olds that are 190-200cm tall. How can I fix this? Thanks.

        $heightarray = array(array(150, 170), array(190, 200));
        $user->where('age', 25);
        for ($i = 0; $i < count($heightarray); $i++) {
            if ($i == 0) {
                $user->whereBetween('height', $heightarray[$i]);
            } else {
                $user->orWhereBetween('height', $heightarray[$i]);
            }
        }
        $user->get();

    Edit: I tried advanced wheres (http://laravel.com/docs/queries#advanced-wheres) and it doesn't work for me, as I cannot pass the $heightarray parameter into the closure. From the Laravel documentation:

        DB::table('users')
            ->where('name', '=', 'John')
            ->orWhere(function($query) {
                $query->where('votes', '>', 100)
                      ->where('title', '<>', 'Admin');
            })
            ->get();

    Read the article

  • SSIS Migration - Pulling IDs from dest DB?

    - by TheSciz
    So I'm working on migrating some data to a new server. In the new server, each entry in the MAIN table is assigned a new GUID when the transfer takes place. A few other tables must be migrated, and their records must link to the GUID in the MAIN table. Example:

        WorksheetID --- GUID
        1245677903  --- 1

        AccidentID --- WorksheetID --- GUID
        12121412   --- 1245677903 --- 1

    The GUID is used more for versioning purposes, but my question is this: in SSIS, is there any way to pull the Worksheet's GUID from the destination database and assign it directly to the entries in the 'Accident' table? Or do I have to just dump the data into the source DB and run some scripts to get everything nicely referenced? Any help would be greatly appreciated.

    Read the article

  • How is Core Data detecting the conflicts, actually?

    - by brainfrog
    Apple says about -detectConflictsForObject:: "If on the next invocation of save: object has been modified in its persistent store, the save fails. This allows optimistic locking for unchanged objects. Conflict detection is always performed on changed or deleted objects." So what does this mean? If I simply modify a managed object and then save the context, is conflict detection always happening? Does this conflict detection simply compare the timestamps of the "records" to see whether the "new" data is actually "old"? Is that a conflict?
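
    Core Data's optimistic locking compares version snapshots rather than timestamps: each object remembers the row version it was fetched with, and at save time that snapshot is checked against what is now in the store. A toy sketch of that version-compare pattern in Python (not the Core Data implementation itself, just the idea it describes):

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE note (id INTEGER PRIMARY KEY, body TEXT, version INTEGER)")
        conn.execute("INSERT INTO note VALUES (1, 'hello', 1)")

        # fetch: remember the version the object was read at (the "snapshot")
        row_id, body, seen_version = conn.execute(
            "SELECT id, body, version FROM note WHERE id = 1"
        ).fetchone()

        # save: succeeds only if nobody bumped the version since the fetch
        updated = conn.execute(
            "UPDATE note SET body = ?, version = version + 1 "
            "WHERE id = ? AND version = ?",
            ("hello world", row_id, seen_version),
        ).rowcount

        if updated == 0:
            raise RuntimeError("optimistic locking conflict: row changed since it was fetched")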

    Read the article

  • Database Optimization techniques for amateurs.

    - by Zombies
    Can we get a list of basic optimization techniques going (anything from modeling to querying, creating indexes, views to query optimization)? It would be nice to have a list of these, one technique per answer. As a hobbyist I would find this very useful, thanks. And for the sake of not being too vague, let's say we are using a mainstream DB such as MySQL or Oracle, and that the DB will contain 500,000-1M or so records across ~10 tables, some with foreign key constraints, all using the most typical storage engines (e.g. InnoDB for MySQL). And of course, the basics such as PKs are defined, as well as FK constraints.

    Read the article

  • Creating a relative path to a Database in Asp.net for a library

    - by Greener
    In school I am part of a team of four working to create a GUI to translate the paper records of a made-up company and their functionality to a digital format. We're using an ASP.NET website for this purpose. Basically we use stored procedures and C# classes to represent the database. The folder we're using for the project contains the site and the libraries in separate folders. If I try to open the site from the folder containing both these elements the site will not run. I want to know if there is some way I can set up a relative path to the database in the Settings.Settings.cs file (or by some other means) of my libraries so I don't have to constantly change the database location for the connection string value every time we move the project. I suppose I should also mention that the database is in an App_Data folder.

    Read the article

  • Is it possible to output other formats than .docx and .odt with TinyButStrong and the OpenTBS plugin

    - by Corum
    I have a module which merges documents from database records and a .docx or .odt document model. I have to output .docx, .odt or .pdf. For outputting the MS and Open formats there is no problem; everything works properly. But what I want to know is whether I can output something (like XML or HTML) which I can then use to build a PDF document. If I can't, are there any libraries which provide document merging like: DOCX (or ODT) + database record => PDF? And I don't want to use phplivedocx.
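
    One common route is to keep the TinyButStrong/OpenTBS merge as it is and convert the merged .docx/.odt to PDF with a headless LibreOffice run. A minimal sketch of that conversion step in Python, assuming the soffice binary is installed and on the PATH (the file names are hypothetical):

        import subprocess
        from pathlib import Path

        merged = Path("merged_brochure.odt")      # file produced by the OpenTBS merge
        out_dir = Path("pdf_out")
        out_dir.mkdir(exist_ok=True)

        # headless LibreOffice converts ODT/DOCX to PDF without a GUI
        subprocess.run(
            ["soffice", "--headless", "--convert-to", "pdf",
             "--outdir", str(out_dir), str(merged)],
            check=True,
        )
        print(out_dir / merged.with_suffix(".pdf").name)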

    Read the article

  • vb.net checkboxes. Need to populate from database and also help in designing

    - by redr
    I have this requirement and, since I'm new to VB.NET, I don't really have much of an idea how to do this. I have 20 checkboxes, each with a dropdown and a textbox. The example layout is:

        <table>
            <tr><td> checkbox -- textbox -- dropdownlist </td></tr>
            <tr><td> chk1  txtbox1  ddl1 </td></tr>
            <tr><td> chk2  txtbox2  ddl2 </td></tr>
        </table>

    ...and so on. Each of the above rows shall be one row of a table. Does anyone know how to build this repeating structure in code, and also how to take the checkbox data from here and send it to a DB table for record insert, update and select? Thanks.

    Read the article

  • Take advantage of multiple cores executing SQL statements

    - by willvv
    I have a small application that reads XML files and inserts the information into a SQL DB. There are ~300,000 files to import, each one with ~1,000 records. I started the application on 20% of the files and it has been running for 18 hours now; I hope I can improve this time for the rest of the files. I'm not using a multi-threaded approach, but since the computer I'm running the process on has 4 cores, I was thinking of doing so to get some improvement in performance (although I guess the main problem is the I/O and not just the processing). I was thinking of using the BeginExecuteNonQuery() method on the SqlCommand object I create for each insertion, but I don't know if I should limit the maximum number of simultaneous threads (nor do I know how to do it). What's your advice for getting the best CPU utilization? Thanks
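
    Before adding threads, the biggest win is usually batching: many inserts per transaction (or a bulk-copy API such as SqlBulkCopy on SQL Server) rather than one command per record, followed by a bounded pool of workers so the database and disk are not overwhelmed. A rough sketch of both ideas in Python; the file locations and table layout are hypothetical and the XML parsing is a stand-in:

        import sqlite3
        import xml.etree.ElementTree as ET
        from concurrent.futures import ThreadPoolExecutor
        from pathlib import Path

        def parse_records(path):
            # stand-in for the real parsing: one payload string per child element
            return [ET.tostring(el, encoding="unicode") for el in ET.parse(path).getroot()]

        def import_file(path):
            records = parse_records(path)
            conn = sqlite3.connect("import.db", timeout=30)
            with conn:                                   # one commit per file, not per record
                conn.executemany(
                    "INSERT INTO records (source_file, payload) VALUES (?, ?)",
                    [(path.name, r) for r in records],
                )
            conn.close()

        sqlite3.connect("import.db").execute(
            "CREATE TABLE IF NOT EXISTS records (source_file TEXT, payload TEXT)"
        )
        files = sorted(Path("xml_in").glob("*.xml"))
        # cap the worker count near the core count; more threads rarely help an I/O-bound load
        with ThreadPoolExecutor(max_workers=4) as pool:
            list(pool.map(import_file, files))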

    Read the article

  • How do I delete records in a table while keeping certain data?

    - by mathew
    My site has lots of incoming searches which are stored in a database to show recent queries on my website. Due to the high volume of search queries, my database is getting bigger in size, so what I want is to keep only the most recent queries in the database, say 10 records. This keeps my database small and queries will be faster. I am able to store the incoming queries in the database, but I don't know how to restrict or delete the excess/old data from the table. Any help? I am using PHP and MySQL.
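
    The usual pattern is to delete everything that falls outside the newest N rows, either right after each insert or from a periodic cron job. A minimal sketch with Python's sqlite3 (the searches table is hypothetical); note that MySQL does not allow LIMIT directly inside a NOT IN subquery, so when porting this back to PHP/MySQL the inner SELECT has to be wrapped in a derived table:

        import sqlite3

        conn = sqlite3.connect("searches.db")
        conn.execute(
            "CREATE TABLE IF NOT EXISTS searches "
            "(id INTEGER PRIMARY KEY AUTOINCREMENT, query TEXT)"
        )

        def log_search(term, keep=10):
            with conn:
                conn.execute("INSERT INTO searches (query) VALUES (?)", (term,))
                # drop everything that is not among the newest `keep` rows
                conn.execute(
                    "DELETE FROM searches WHERE id NOT IN "
                    "(SELECT id FROM searches ORDER BY id DESC LIMIT ?)",
                    (keep,),
                )

        log_search("example query")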

    Read the article

  • NHibernate - Is ITransaction.Commit really necessary?

    - by user365383
    Hi. I just started studying NHibernate 2 days ago, and I'm looking at a CRUD method that I've written based on a tutorial. My insert method is:

        using (ISession session = Contexto.OpenSession())
        using (ITransaction transaction = session.BeginTransaction())
        {
            session.Save(noticia);
            transaction.Commit();
            session.Close();
        }

    The complete code of "Contexto" is here: http://codepaste.net/mrnoo5

    My question is: do I really need to use ITransaction transaction = session.BeginTransaction() and transaction.Commit()? I'm asking because I've tested running the web app without those two lines, and I've successfully inserted new records. If possible, can someone also explain the purpose of ITransaction and the Commit method? Thanks.

    Read the article
