Search Results

Search found 17140 results on 686 pages for 'records management'.

Page 218/686 | < Previous Page | 214 215 216 217 218 219 220 221 222 223 224 225  | Next Page >

  • WinForms checkbox data binding problem

    - by Savvas Sopiadis
    Hello everybody! In a WinForms application (VB, VS2008 SP1) I bound a CheckBox to a SQL Server 2005 BIT field. The data binding itself seems to work, but there is one little problem: the user creates a new record and checks the checkbox, then decides to create another new record (without having saved the previous one, so there are two new records to be submitted) and checks that one as well. Now the user saves these records: the result is that only the second record keeps the checked value; the first one is unchecked! (I tried the same with 5 records, with the same result: the first 4 records are unchecked and only the last one keeps the checked state.) What am I missing? Thanks in advance

    Read the article

  • Use a Cursor field in another method

    - by Mats Hofman
    Hi, in my app I have a Cursor field, and in the onStart() method of my Android Service I create it by fetching records from my database. When I look into the cursor in the onStart() method I find a number of records, but when I try to use it in my trigger() method it has zero records.

    The field:      private Cursor c;
    In onStart():   c = dbHelper.fetchAllRecords();
    In trigger():   c.getCount() returns 0

    I don't close the cursor anywhere earlier than in my onDestroy() method.

    Read the article

  • Bulk Copy from one server to another

    - by Joseph
    Hi all, I have a situation where I need to copy part of the data from one server to another. The table schemas are exactly the same. I need to move partial data from the source, which may or may not already be present in the destination table. The solution I'm thinking of is to use bcp to export the data to a text (or .dat) file and then take that file to the destination, as the two servers are not accessible at the same time (different networks), and import the data there. There are some conditions I need to satisfy:

    1. I need to export only a list of rows from the table, not the whole table. My client is going to give me the IDs that need to be moved from source to destination. I have around 3000 records in the master table, and the same in the child tables too. I expect only about 300 records to be moved.
    2. If a record already exists in the destination, the client will instruct, case by case, whether to ignore or overwrite it. 90% of the time we need to ignore the records without overwriting, but log them in a log file.

    Please help me with the best approach. I thought of using bcp with the query option to filter the data, but while importing, how do I bypass inserting the existing records? And if I need to overwrite, how do I do it? Thanks a lot in advance. ~Joseph
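    A minimal sketch of one common pattern, assuming SQL Server on both sides, trusted connections, and hypothetical names throughout (SourceDb.dbo.Orders, an IdsToMove list, an Orders_Staging table, an ImportLog table, and an Overwrite flag supplied per the client's instructions): export the filtered rows with bcp queryout, import the file into a staging table at the destination, then insert only the rows that do not already exist and log the ones that were skipped.

      -- On the source server: export only the requested IDs (bcp is run from the command line).
      -- bcp "SELECT * FROM SourceDb.dbo.Orders WHERE OrderID IN (SELECT OrderID FROM SourceDb.dbo.IdsToMove)" queryout orders.dat -c -T -S SourceServer

      -- On the destination server: load the file into a staging table with the same schema.
      -- bcp DestDb.dbo.Orders_Staging in orders.dat -c -T -S DestServer

      -- Log the rows that already exist (the usual "ignore" case).
      INSERT INTO DestDb.dbo.ImportLog (OrderID, Reason, LoggedAt)
      SELECT s.OrderID, 'Already exists - skipped', GETDATE()
      FROM DestDb.dbo.Orders_Staging AS s
      WHERE EXISTS (SELECT 1 FROM DestDb.dbo.Orders AS d WHERE d.OrderID = s.OrderID);

      -- Insert only the rows that are not already in the destination.
      INSERT INTO DestDb.dbo.Orders (OrderID, OrderDate, Amount)
      SELECT s.OrderID, s.OrderDate, s.Amount
      FROM DestDb.dbo.Orders_Staging AS s
      WHERE NOT EXISTS (SELECT 1 FROM DestDb.dbo.Orders AS d WHERE d.OrderID = s.OrderID);

      -- For the occasional "overwrite" case, update the matching rows instead.
      UPDATE d
      SET d.OrderDate = s.OrderDate, d.Amount = s.Amount
      FROM DestDb.dbo.Orders AS d
      JOIN DestDb.dbo.Orders_Staging AS s ON s.OrderID = d.OrderID
      WHERE s.Overwrite = 1;   -- hypothetical per-row flag for the client's overwrite decision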

    Read the article

  • PostgreSQL: Full Text Search - How to search partial words?

    - by Anthoni Gardner
    Hello, following a question posted here about how I can increase the speed of one of my SQL search methods, I was advised to update my table to make use of full text search. This is what I have now done, using GiST indexes to make searching faster. On some of the "plain" queries I have noticed a marked increase, which I am very happy about. However, I am having difficulty searching for partial words. For example, I have several records that contain the word Squire (454) and several records that contain Squirrel (173). Now if I search for Squire it only returns the 454 records, but I also want it to return the Squirrel records as well. My query looks like this:

      SELECT title FROM movies WHERE vectors @@ to_tsquery('squire');

    I thought I could do to_tsquery('squire%'), but that does not work. How do I get it to search for partial matches? Also, in my database I have records that are movies and others that are just TV shows. These are differentiated by the quotes around the name, so "Munsters" is a TV show, whereas The Munsters is the film of the show. What I want to be able to do is search for just the TV shows AND just the movies. Any idea how I can achieve this? Regards, Anthoni
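    A hedged sketch, assuming PostgreSQL 8.4 or later and the table and column names from the question: tsquery supports prefix matching with the :* modifier, and the movie/TV-show distinction could be filtered with a simple LIKE on the leading quote character.

      -- Prefix search: matches lexemes beginning with 'squir', e.g. squire and squirrel.
      SELECT title
      FROM movies
      WHERE vectors @@ to_tsquery('squir:*');

      -- Only TV shows (titles stored with surrounding quotes).
      SELECT title
      FROM movies
      WHERE vectors @@ to_tsquery('squir:*')
        AND title LIKE '"%';

      -- Only movies (titles without the leading quote).
      SELECT title
      FROM movies
      WHERE vectors @@ to_tsquery('squir:*')
        AND title NOT LIKE '"%';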

    Read the article

  • DBCC SQL Server 2000 sysindexkeys

    - by Debasish Pramanik
    I have a SQL Server 2000 database. When I run the following command:

      select * from sysindexkeys

    it displays the appropriate records. I then run a DBCC command against sysindexkeys, and it doesn't display anything. Strangely, there seems to be no page holding the sysindexkeys records. How, then, does the query display the list of records?

    Read the article

  • After writing SQL statements in MySQL, how do I measure their speed/performance?

    - by Jian Lin
    I saw something in an "execution plan" article: 10 rows fetched in 0.0003s (0.7344s). Why are two durations shown? Also, what if I don't have a large data set yet? For example, if I have only 20, 50, or even just 100 records, I can't really measure how two different SQL statements compare in terms of speed in a real-life situation, can I? In other words, do I need at least hundreds of thousands of records, or even a million, to accurately compare the performance of two different SQL statements?
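    A hedged sketch of two built-in ways to compare statements even on a small table, assuming MySQL 5.0.37 or later for the profiler and hypothetical tables (orders, customers): EXPLAIN shows the plan the optimizer chooses (index use, estimated rows examined), and SHOW PROFILES shows where the time actually went. On tiny data sets, differences in the plan are usually more telling than differences in the raw timings.

      -- Compare the plans the optimizer chooses for the two candidate statements.
      EXPLAIN SELECT o.* FROM orders o JOIN customers c ON c.id = o.customer_id WHERE c.country = 'US';
      EXPLAIN SELECT * FROM orders WHERE customer_id IN (SELECT id FROM customers WHERE country = 'US');

      -- Profile the actual executions.
      SET profiling = 1;
      SELECT o.* FROM orders o JOIN customers c ON c.id = o.customer_id WHERE c.country = 'US';
      SELECT * FROM orders WHERE customer_id IN (SELECT id FROM customers WHERE country = 'US');
      SHOW PROFILES;              -- one row per statement with its total duration
      SHOW PROFILE FOR QUERY 2;   -- stage-by-stage breakdown (use the Query_ID reported by SHOW PROFILES)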

    Read the article

  • Duplicate entries in MySQL on insert using Doctrine

    - by Nikos Galis
    Hi all! I am facing a very weird problem with MySQL and Doctrine (with the help of CodeIgniter). I am trying to write a simple migration script that takes all records from one table and, after a little processing, saves them to another. However, on my laptop (running Windows and WAMP) I get double the number of the original table's records copied into the destination table. On my colleagues' laptops, everything works fine! We are all using MySQL 5.0.86 (plus Windows plus WAMP). Here is the code:

      function buggy_function() {
          $this->db(); // get db connection
          $q = Doctrine_Query::create()->from('Oldtable r');
          $oldrecords = $q->fetchArray();
          $count = 0;
          foreach ($oldrecords as $oldrecord) {
              $newrecord = new NewTableClass();
              $newrecord->password = md5($oldrecord['password']);
              $newrecord->save();
              echo $newrecord->id . ' Id -> saved.';
          }
      }

    Simple as that! I have 39 records in the old table and I am getting 78 records in the new table, which are exactly the same records except for the unique primary key. It seems as if the script runs twice, but the output of the script is the following:

      1 Id -> saved.
      2 Id -> saved.
      ...
      39 Id -> saved.

    Do you have any idea why this is happening? Any known bug in MySQL? Thank you in advance!

    Read the article

  • Using transactions with ADO.NET Data Adapters.

    - by Ergwun
    Scenario: I want to let multiple (2 to 20, probably) server applications use a single database using ADO.NET. I want individual applications to be able to take ownership of sets of records in the database, hold them in memory (for speed) in DataSets, respond to client requests on the data, perform updates, and prevent other applications from updating those records until ownership has been relinquished. I'm new to ADO.NET, but it seems like this should be possible using transactions with Data Adapters (ADO.NET disconnected layer). Question part 1: Is that the right way to try and do this? Question part 2: If that is the right way, can anyone point me at any tutorials or examples of this kind of approach (in C#)? Question part 3: If I want to be able to take ownership of individual records and release them independently, am I going to need a separate transaction for each record, and by extension a separate DataAdapter and DataSet to hold each record, or is there a better way to do that? Each application will likely hold ownership of thousands of records simultaneously.

    Read the article

  • Performance problem: data warehouse with lots of indexes

    - by Lieven Cardoen
    Our product takes tests of some 350 candidates at the same time. At the end of a test, the results for each candidate are moved to a data warehouse that has lots of indexes on it. For each test there are some 400 records to be entered into the warehouse, so 400 x 350 is a lot of records. If there are not many records in the data warehouse yet, all goes well. But once it already holds lots of records, a lot of the inserts fail... Is there a way to have indexes that are only rebuilt at the end of the day, or isn't that the real problem? How would you solve this?
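    If the warehouse is SQL Server 2005 or later (an assumption - the question doesn't say which database is used), one sketch is to disable the nonclustered indexes before the heavy insert window and rebuild them at the end of the day, so each insert doesn't pay for index maintenance; the table and index names below are hypothetical.

      -- Before the heavy insert window: disable the nonclustered indexes on the fact table.
      ALTER INDEX IX_TestResults_Candidate ON dbo.TestResults DISABLE;
      ALTER INDEX IX_TestResults_TestDate  ON dbo.TestResults DISABLE;

      -- ... the 400 x 350 result rows are inserted here ...

      -- At the end of the day: rebuild everything in one pass.
      ALTER INDEX ALL ON dbo.TestResults REBUILD;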

    Read the article

  • Creating a "less"-like console pager interface for pysqlite3 database

    - by Eric
    I would like to add some interactive capability to a Python CLI application I've written that stores data in a SQLite3 database. Currently, my app reads in a certain type of file, parses and analyzes it, puts the analysis data into the db, and spits the formatted records to stdout (which I generally pipe to a file). There are on the order of a million records in this file. Ideally, I would like to eliminate that text-file situation altogether and just loop after the "parse and analyze" part, displaying a screenful of records at a time and allowing the user to page through them and enter commands that edit the records. The backend part I know how to do. Can anyone suggest a good starting point for creating that pager frontend, either directly in the console (like the pager "less"), through ncurses, or some other system?

    Read the article

  • Can I output/flush data to the screen while processing an Ajax page?

    - by Bee
    I need to display on my page a list of records pulled from a table. Ajax works fine (I query the database and put all the data inside a container element on the main page), but if I have lots of records (say 500+) it will hang until the data is fully loaded, and only then is it sent back to the page and correctly displayed. I would like to be able to display the records on the page while getting them, instead of being forced to wait until completion. I am trying flush() inside the remote (Ajax) page, but it still waits until the full data is loaded. This is what I currently have inside the Ajax page, at the very beginning:

      @apache_setenv('no-gzip', 1);
      @ini_set('zlib.output_compression', 0);
      @ini_set('implicit_flush', 1);
      for ($i = 0; $i < ob_get_level(); $i++) { ob_end_flush(); }
      ob_implicit_flush(1);

    Then after every echo call:

      ob_flush();

    Now if I load the Ajax page on its own, it lists the records while reading them from the database. But if I call the same page via Ajax, it hangs and sends all the data at once. Any idea? This is the function I use to get the Ajax content ('id' is the target, 'url' refers to the Ajax page that runs the database query to list the records):

      function ajax(id, url) {
          xmlhttp = new XMLHttpRequest();
          xmlhttp.open("GET", url, false);
          xmlhttp.send(null);
          document.getElementById(id).innerHTML = parseScript(xmlhttp.responseText);
      }

    Read the article

  • Handle cases where an NHibernate subclass does not exist

    - by kaykayman
    I have a scenario where I am using NHibernate to map records from one table to several different derived classes based on a discriminator.

      public class BaseClass { }
      public class DerivedClass0 : BaseClass { }
      public class DerivedClass1 : BaseClass { }
      public class DerivedClass2 : BaseClass { }

    I then use Fluent NHibernate's DiscriminateSubClassesOnColumn() method and alter the configuration to include

      <subclass name="DerivedClass0" extends="BaseClass" discriminator-value="discriminator0" />
      <subclass name="DerivedClass1" extends="BaseClass" discriminator-value="discriminator1" />
      <subclass name="DerivedClass2" extends="BaseClass" discriminator-value="discriminator2" />

    so that when mapped, these records come back as the derived classes and not as BaseClass. However, there are some records in my database which have a discriminator that does not have a corresponding subclass. In those cases, NHibernate throws an error: "Object with id: 'xxx' was not of the specified subclass...". Is there some way I can handle this, so that any records which do not have a corresponding subclass are mapped to BaseClass rather than an error being thrown? I have simplified the above as much as possible, but it is worth noting that the XML is edited dynamically, which is why I am referencing Fluent NHibernate (DiscriminateSubClassesOnColumn()) and XML at the same time. The following things (which would help) are not an option: I cannot correct the data to remove the invalid records, and I cannot create subclasses for the records which do not have one. I need to handle cases where NHibernate tries to map on a discriminator and finds that a subclass does not exist.

    Read the article

  • ExtJS data store: load data on the fly

    - by CKeven
    I'm trying to create a data store that will load the data schema and records on the fly. Here is the current code I have; I'm not sure how to set up the array reader properly, since I don't have the schema before the query returns.

      ds = new Ext.data.Store({
          url: 'http://10.10.97.83/cgi-bin/cgiip.exe/WService=wsdev/majax/jsbrdgx.p',
          baseParams: { cr: Ext.util.JSON.encode(omgtobxParms) },
          reader: new Ext.data.ArrayReader({
              // root: data.value.records
          }, col_names)
      });

    The response looks like this:

      {"name": "tmp_buy_book",
       "schema": [
          { "name": "a", "type": "C"},
          { "name": "b", "type": "C"}
       ],
       "records": [["1", ""], ["1", ""]]}

    Read the article

  • Non distinct Unique ID in MySQL database table.

    - by Geoff
    First off, a simplified version: I am wondering if I can create a trigger that fires during INSERT (it's actually LOAD DATA INFILE) and does NOT enter records for an RMA that is already in my table. I have a table that has no records that are unique. Some may be duplicates, but there is one field that I can use to know whether the data has been entered or not. For instance:

      RMA   Op      Days
      ---------------------
      213   Repair  0.10
      213   Test    0.20
      213   Repair  0.10

    I could put an index on the three columns together, but as you can see it's possible for an RMA to be in a step for the same amount of time twice, so duplicate records are possible. This data comes from a report that I cannot edit, and this is all it provides. The key point is that an RMA's data appears in the report only once, so if my database already has that RMA in its records I want to skip loading that RMA's records from the report. By all means let me know if that didn't make sense; I'll explain as needed. I'm sure it's not uncommon, but I couldn't find anything on the net.
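    Since a MySQL 5.0 trigger can't easily suppress rows coming from LOAD DATA INFILE, one commonly used alternative, sketched here with hypothetical names (rma_steps, rma_steps_staging) and placeholder file-format clauses, is to load the report into a staging table and then copy over only the RMAs that aren't already present:

      -- Load the report into an empty staging table with the same three columns.
      TRUNCATE TABLE rma_steps_staging;
      LOAD DATA INFILE '/tmp/report.txt'
      INTO TABLE rma_steps_staging
      FIELDS TERMINATED BY '\t'
      LINES TERMINATED BY '\n'
      (rma, op, days);

      -- Copy only rows whose RMA is not already in the real table
      -- (legitimate duplicates within a new RMA are kept).
      INSERT INTO rma_steps (rma, op, days)
      SELECT s.rma, s.op, s.days
      FROM rma_steps_staging AS s
      WHERE NOT EXISTS (SELECT 1 FROM rma_steps AS t WHERE t.rma = s.rma);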

    Read the article

  • Does a transaction stop all race condition problems in MySQL?

    - by nickf
    Consider this situation:

    1. Begin a transaction
    2. Insert 20 records into a table with an auto_increment key
    3. Get the first insert id (let's say it's 153)
    4. Update all records in that table where id >= 153
    5. Commit

    Is step 4 safe? That is, if another request comes in almost precisely at the same time, and inserts another 20 records after step 2 above but before step 4, will there be a race condition?
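    A hedged sketch of a tighter variant, assuming InnoDB and a single multi-row INSERT (whose auto_increment values are allocated as one consecutive block under MySQL's default auto-increment locking): bound the UPDATE to exactly the rows this transaction inserted rather than using an open-ended id >= 153, so rows committed by other sessions in the meantime are never touched. Table and column names are hypothetical.

      START TRANSACTION;

      -- One multi-row INSERT: its 20 auto_increment ids form one consecutive block.
      INSERT INTO jobs (payload)
      VALUES ('r1'), ('r2'), ('r3'), ('r4'), ('r5'), ('r6'), ('r7'), ('r8'), ('r9'), ('r10'),
             ('r11'), ('r12'), ('r13'), ('r14'), ('r15'), ('r16'), ('r17'), ('r18'), ('r19'), ('r20');

      SET @first_id = LAST_INSERT_ID();   -- id of the first of the 20 rows

      -- Update only the 20 rows inserted above, never rows another session may have added.
      UPDATE jobs
      SET status = 'queued'
      WHERE id BETWEEN @first_id AND @first_id + 19;

      COMMIT;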

    Read the article

  • T-SQL: picking up active IDs from a comma-separated ID list

    - by hammayo
    I have two tables. The first, "Product", has the following structure:

      ProductID, ProductName, IsSaleTypeA, IsSaleTypeB, IsSaleTypeC
      1, AAA, N, N, N
      2, BBB, N, Y, N  -- active
      3, CCC, N, N, N
      4, DDD, Y, N, N  -- active
      5, EEE, N, N, N
      6, FFF, N, N, N
      7, FFE, N, N, N
      8, GGG, N, N, N
      9, HHH, Y, N, N  -- active

    The second table, "ProductAllowed", has the following structure, where ProductIDs is a comma-separated string field holding a mix of active and inactive product IDs based on their IsSaleType mode:

      ProductCode, ProductIDs
      AMRLSPN, "1,2"
      AMRLOFD, "1,3"
      BLGHVF, "2,4,6"
      BLGHVO, "2,4"
      BLGHVD, "3,5"
      BLGSDO, "0"
      CHOHVF, "1,6"
      CHOHVP, "1,2,7,8"
      ...

    Q: Is there a T-SQL query that will return the active records from the "ProductAllowed" table, i.e. those where any of the three IsSaleType fields is switched on for one of the listed products? Based on the sample data, the query should return the following ProductAllowed records:

      AMRLSPN
      BLGHVF
      BLGHVO
      BLGSDO
      CHOHVP

    This needs to work in a SQL Server 2000 database containing approximately 150,000 records. Thanks
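    On SQL Server 2000 (which has no built-in split function), one minimal sketch is to join the two tables with a delimiter-padded LIKE, so each comma-separated list is matched against the IDs of active products. This assumes the quotes in the sample are just formatting and ProductIDs stores plain lists like 1,2; it also doesn't explain the expected BLGSDO row (list "0"), which suggests an extra rule not shown in the question.

      SELECT DISTINCT pa.ProductCode
      FROM ProductAllowed AS pa
      JOIN Product AS p
        ON ',' + pa.ProductIDs + ',' LIKE '%,' + CAST(p.ProductID AS varchar(10)) + ',%'
      WHERE p.IsSaleTypeA = 'Y'
         OR p.IsSaleTypeB = 'Y'
         OR p.IsSaleTypeC = 'Y'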

    Read the article

  • PHP page navigation by serial number

    - by ilnur777
    Can anyone help me change this PHP page navigation script so that it counts using a normal serial number? In the script there is a var called "page_id"; I want this var to hold the real page link in order, like 0, 1, 2, 3, 4, 5 ...

      $records = 34;     // total records
      $pagerecord = 10;  // count of records to display per page

      if ($records <= $pagerecord) return;

      $imax = (int)($records / $pagerecord);
      if ($records % $pagerecord > 0) $imax = $imax + 1;

      if ($activepage == '') {
          $for_start = $imax;
          $activepage = $imax - 1;
      }

      $next = $activepage - 1;
      if ($next < 0) { $next = 0; }
      $prev = $activepage + 1;
      if ($prev >= $imax) { $prev = $imax - 1; }

      $end = 0;
      $start = $imax;

      if ($activepage >= 0) {
          $for_start = $activepage + $rad + 1;
          if ($for_start < $rad * 2 + 1) $for_start = $rad * 2 + 1;
          if ($for_start >= $imax) {
              $for_start = $imax;
          }
      }

      if ($activepage < $imax - 1) {
          $str .= ' <a href="?domain=' . $domain_name . '&page=' . ($start - 1) . '&page_id=xxx"><<< End</a> <a href="?domain=' . $domain_name . '&page=' . $prev . '&page_id=xxx">< Forward</a> ';
      }

      $meter = $rad * 2 + 1;
      for ($i = $for_start - 1; $i > -1; $i--) {
          $meter--;
          $line = '';
          if ($i > 0) $line = "";
          if ($i <> $activepage) {
              $str .= "<a href='?domain=" . $domain_name . "&page=" . $i . "&page_id=xxx'>" . ($i) . "</a> " . $line . " ";
          } else {
              $str .= " <b class='current_page'>" . ($i) . "</b> " . $line . " ";
          }
          if ($meter == '0') {
              break;
          }
      }

      if ($activepage > 0) {
          $str .= " <a href='?domain=" . $domain_name . "&page=" . $next . "&page_id=xxx'>Back ></a> <a href='?domain=" . $domain_name . "&page=" . ($end) . "&page_id=xxx'>Start >>></a> ";
      }

      return $str;

    Really need help with this stuff! Thanks in advance!

    Read the article

  • Should I expect Comet to be this slow?

    - by Chad Johnson
    I have the following in a Rails controller:

      def poll
        records = []
        start_time = Time.now.to_i
        while records.length == 0 do
          records = Something.uncached { Something.find(:all, :conditions => { :some_condition => false }) }
          if records.length > 0
            break
          end
          sleep 1
          if Time.now.to_i - start_time >= 20
            break
          end
        end
        responseData = []
        records.each do |record|
          responseData << { 'something' => record.some_value }
          # Flag message as received.
          record.some_condition = true
          record.save
        end
        render :text => responseData.to_json
      end

    and then I have JavaScript performing an AJAX request. The request sits there, waiting, for up to 20 seconds or until the controller method finds a record in the database. That works.

      function poll() {
          $.ajax({
              url: '/my_controller/poll',
              type: 'GET',
              dataType: 'json',
              cache: false,
              data: 'time=' + new Date().getTime(),
              success: function(response) {
                  // show response here
              },
              complete: function() {
                  poll();
              },
              error: function() {
                  alert('error');
                  poll();
              }
          });
      }

    When I have 5 - 10 tabs open in my browser, my web application becomes super slow. Is this to be expected? Or is there some obvious improvement I can make?

    Read the article

  • Thread vs ThreadPool - .NET 2.0

    - by NLV
    Hello, I'm not able to understand the difference between Thread and ThreadPool. Say I have to manipulate 50,000 records using threads. With plain threads I need to predefine either the number of threads or the number of records per thread; one of the two has to be constant. With the thread pool, in theory I don't need to set either of them, but in practice I still need to decide the number of records per work item, because the number of queued work items may grow extremely large if the number of input records is huge. Any insights on this?

    Read the article

  • Sorting the data returned by a database

    - by Rishabh Ohri
    Hi all, in our project we have a requirement that when a set of records is returned by the database, the records should be sorted with respect to the TITLE field. The records have to be sorted alphabetically, but if the title of a record contains a number, it should come after the records whose titles consist only of letters. Details: we are using SQL Server and C#. The data from the database goes into an entity class, which forwards it to the other layers. What would be a possible and effective solution for this requirement?
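    One hedged sketch, assuming the sort can be pushed into the SQL Server query itself (the table and column names are hypothetical): order first by a flag marking titles that contain any digit, then alphabetically within each group.

      SELECT Title
      FROM Records
      ORDER BY
          CASE WHEN Title LIKE '%[0-9]%' THEN 1 ELSE 0 END,  -- titles containing digits sort last
          Title;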

    Read the article

  • SSIS - How do I use a resultset as input in a SQL task and get data types right?

    - by thursdaysgeek
    I am trying to merge records from an Oracle database table into my local SQL table. I have a package variable that is an Object, called OWell. I have a data flow task that gets the Oracle data with a SQL statement (select well_id, well_name from OWell order by Well_ID), and then a conversion task that converts well_id from a DT_STR of length 15 to a DT_WSTR, and well_name from a DT_STR of length 15 to a DT_WSTR of length 50. That is then stored in the recordset OWell. The reason for the conversions is that in the table I want to add records to (which has an identity field), SSIS shows well_id as a DT_WSTR of length 15 and well_name as a DT_WSTR of length 50. I then have a SQL task that connects to the local database and attempts to add the records that are not there yet. I've tried various things, such as using OWell as a result set and referring to it in my SQL statement. Currently, I have the ResultSet set to None and the following SQL statement:

      Insert into WELL (WELL_ID, WELL_NAME)
      Select OWELL_ID, OWELL_NAME from OWell
      where OWELL_ID not in (select WELL.WELL_ID from WELL)

    For parameter mapping, I have Parameter 0, called OWell_ID, from my variable User::OWell, and Parameter 1, called OWell_Name, from the same variable. Both are set to VARCHAR, although I've also tried NVARCHAR. I do not have a result set. I am getting the following error:

      Error: 0xC002F210 at Insert records to FLEDG, Execute SQL Task: Executing the query "Insert into WELL (WELL_ID, WELL_NAME) Select OWELL..." failed with the following error: "An error occurred while extracting the result into a variable of type (DBTYPE_STR)". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.

    I don't think it's a data type issue, but rather that I somehow am not using the result set properly. How exactly am I supposed to refer to that recordset in my SQL task, so that I can use the two recordset fields and add the records that are missing?
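    One sketch of an alternative that avoids referencing the Object variable inside the SQL statement at all: land the converted Oracle rows in a staging table from the data flow, then let the Execute SQL Task insert only the missing wells. The staging table name is hypothetical; the rest reuses the names from the question.

      -- Run in the Execute SQL Task after the data flow has loaded dbo.OWell_Staging.
      INSERT INTO WELL (WELL_ID, WELL_NAME)
      SELECT s.WELL_ID, s.WELL_NAME
      FROM dbo.OWell_Staging AS s
      WHERE NOT EXISTS (SELECT 1 FROM WELL AS w WHERE w.WELL_ID = s.WELL_ID);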

    Read the article

  • PHP edit unique row in table

    - by Robert
    I currently have a PHP form that uses AJAX to connect to MySQL and display records matching a user's selection (http://stackoverflow.com/questions/2593317/ajax-display-mysql-data-with-value-from-multiple-select-boxes). As well as displaying the data, I also place an 'Edit' button next to each result, which displays a form where the data can be edited. My problem is editing unique records, since currently I only use the selected values for 'name' and 'age' to find the record. If two (or more) records share the same name and age, I am only able to edit the first result.
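    A hedged sketch of the usual fix, assuming the table has (or can be given) an auto-increment primary key and using hypothetical names (people, id): select the key along with the display columns, carry it through the edit form, and use it alone in the UPDATE, so rows that share a name and age can still be told apart.

      -- Give the table a surrogate key if it doesn't already have one.
      ALTER TABLE people ADD COLUMN id INT NOT NULL AUTO_INCREMENT PRIMARY KEY;

      -- Return the key with every search result so each 'Edit' button can reference it.
      SELECT id, name, age FROM people WHERE name = 'Robert' AND age = 30;

      -- Update exactly one row, identified by the key carried in the edit form.
      UPDATE people SET name = 'Robert', age = 31 WHERE id = 7;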

    Read the article

  • Looping through siblings of a specific row/setting up function

    - by Matt
    Trying to work through the JavaScript book I am using to learn the language, I got stuck on looping through the siblings of a specific row. W3Schools and W3 didn't have what I was looking for. Below is the function walk-through. It reads:

    Create the countRecords() function. The purpose of this function is to count the number of visible rows in the data table after the table headings. The total is then displayed in the table cell with the id "records". Add the following commands to the function:
    a. Create an object named headRow that points to the table row with the id "titleRow". Create a variable named rowCount, setting its initial value to 0.
    b. Create a for loop that uses familial references, starting with the first sibling of headRow and moving to the next sibling until there are no siblings left. Within the for loop, test whether the node name of the current node is equal to "TR". If it is, test whether the value of its display style is equal to an empty text string. If it is (indicating that it is visible in the document), increase the value of the rowCount variable by 1.
    c. Change the text of the "records" table cell to the value of the rowCount variable. Don't use innerHTML. Create a text node that contains the value of the rowCount variable and assign it to a variable called txt. Create a variable called record to store the reference to the "records" table cell.
    d. Insert an if condition that tests whether the "records" cell has any child nodes. If it does, replace the text node of the "records" table cell with the created text node (txt). If it doesn't, append the text node to the cell.

    Here is what I have so far:

      // part a
      var headRow;
      var rowCount = 0;

      // part b - this is where I get lost. I know I need to access the id titleRow,
      // but I'm unsure how to set my loop up specifically for this.
      headRow = document.getElementById("titleRow");
      for (var i = 0; i < headRow.length; i++) {
          if (something is not equal == "TH") {
              // make code happen here
          }
          if (is "TR" == "") {
              rowCount += 1;
          }
      }

      // part c
      var txt = document.createTextNode(rowCount);
      var record = document.getElementById("records");

      // part d - holding off on this part until I get a, b, c figured out.

    The supporting HTML snippet:

      <table id="filters">
        <tr><th colspan="2">Filter Product List</th></tr>
        <tr>
          <td>Records: </td>
          <td id="records"></td>
        </tr>
      </table>

      <table id="prodTable">
        <tr><th colspan="8">Digital Cameras</th></tr>
        <tr id="titleRow">
          <th>Model</th>
          <th>Manufacturer</th>
          <th>Resolution</th>
          <th>Zoom</th>
          <th>Media</th>
          <th>Video</th>
          <th>Microphone</th>
        </tr>
      </table>

    Thanks for the help!

    Read the article

  • How to combine Translate and Soft Deletable Behavior in CakePHP 1.2.7?

    - by m99
    Hi guys, I'm trying to combine the Translate behavior and Mariano Iglesias' SoftDeletable behavior (revision 49). But whenever I want to soft-delete a record which hasMany other records that are translated (partly located in the i18n table), the related hasMany records aren't soft-deleted. Example:

      Post hasMany Comments (dependent = true)
      Post actsAs SoftDeletable
      Comments actsAs SoftDeletable, Translate

    The Post record gets soft-deleted, but I also get an error for the dependent Comment records and they aren't soft-deleted. Any suggestions? Thanks in advance, Marco

    Read the article
