Search Results

Search found 7311 results on 293 pages for 'rows'.

Page 95/293 | < Previous Page | 91 92 93 94 95 96 97 98 99 100 101 102  | Next Page >

  • Mat matrix multiplication, openCV?

    - by facebook-1593205594
    I initialized two Mat images as:

        Mat ft = Mat::zeros(src.rows, src.cols, CV_32FC1), h = Mat::zeros(src.rows, src.cols, CV_32FC1);

    After some calculations, ft holds the Fourier transform of an image and h holds the matrix for Laplacian filtering in the Fourier domain. They both have the same dimensions. I then multiplied them, using both h*ft and the call gemm(h, ft, 1, NULL, 0, temp);, but at runtime it fails with an OpenCV assertion error (a long message that ends by mentioning gemm in matmul.cpp) and the program terminates after throwing an instance of 'cv::Exception'.
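
    For frequency-domain filtering the usual operation is a per-element product rather than a true matrix product, and in the C++ API the unused src3 argument of gemm expects an empty Mat rather than NULL. A rough, untested sketch along those lines (variable names follow the question):

        #include <opencv2/core/core.hpp>

        // ft: DFT of the image, h: filter in the frequency domain (both CV_32FC1).
        // Per-element product, which is what applying a filter in the Fourier
        // domain normally needs:
        cv::Mat filtered;
        cv::multiply(h, ft, filtered);

        // A true matrix product, if that is really the intent; note cv::Mat()
        // instead of NULL for the unused third operand:
        cv::Mat temp;
        cv::gemm(h, ft, 1.0, cv::Mat(), 0.0, temp);

    If the matrices hold packed DFT output, cv::mulSpectrums(h, ft, filtered, 0) is the variant designed for that layout.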

    Read the article

  • How Do I Prevent Rails From Treating Updated Nested Attributes Differently From New Nested Attribute

    - by James
    I am using rails3 beta3 and couchdb via couchrest. I am not using active record. I want to add multiple "Sections" to a "Guide" and add and remove sections dynamically via a little javascript. I have looked at all the screencasts by Ryan Bates and they have helped immensely. The only difference is that I want to save all the sections as an array of sections instead of individual sections. Basically like this:

        "sections" => [{"title" => "Foo1", "content" => "Bar1"}, {"title" => "Foo2", "content" => "Bar2"}]

    So, basically I need the params hash to look like that when the form is submitted. When I create my form I am doing the following:

        <%= form_for @guide, :url => { :action => "create" } do |f| %>
          <%= render :partial => 'section', :collection => @guide.sections %>
          <%= f.submit "Save" %>
        <% end %>

    And my section partial looks like this:

        <%= fields_for "sections[]", section do |guide_section_form| %>
          <%= guide_section_form.text_field :section_title %>
          <%= guide_section_form.text_area :content, :rows => 3 %>
        <% end %>

    Ok, so when I create the guide with sections, it works perfectly as I would like: the params hash gives me a sections array just like I want. The problem comes when I want to edit the guide/sections and save them again, because Rails inserts the id of the record into the id and name of each form field, which screws up the params hash on form submission. Just to be clear, here is the raw form output for a new resource:

        <input type="text" size="30" name="sections[][section_title]" id="sections__section_title">
        <textarea rows="3" name="sections[][content]" id="sections__content" cols="40"></textarea>

    And here is what it looks like when editing an existing resource:

        <input type="text" value="Foo1" size="30" name="sections[cd2f2759895b5ae6cb7946def0b321f1][section_title]" id="sections_cd2f2759895b5ae6cb7946def0b321f1_section_title">
        <textarea rows="3" name="sections[cd2f2759895b5ae6cb7946def0b321f1][content]" id="sections_cd2f2759895b5ae6cb7946def0b321f1_content" cols="40">Bar1</textarea>

    How do I force Rails to always use the new-resource behavior and not automatically add the id to the name and value? Do I have to create a custom form builder? Is there some other trick I can do to prevent Rails from putting the id in there? I have tried a bunch of stuff and nothing is working. Thanks in advance!
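
    One way to keep the array-style field names on edit, without a custom form builder, is to build the inputs with the *_tag helpers so Rails never gets a chance to inject the record id. A hedged sketch of the section partial (assumes section responds to section_title and content, as in the question):

        <%# _section partial, hypothetical rewrite using tag helpers %>
        <%= text_field_tag "sections[][section_title]", section.section_title, :size => 30 %>
        <%= text_area_tag "sections[][content]", section.content, :rows => 3, :cols => 40 %>

    The names then stay sections[][...] for both new and persisted sections, so the params hash keeps the array shape shown above.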

    Read the article

  • First Tap on customcell of uitableview should expand it and second should contract it.

    - by neha
    Hi all, in my application the first tap on a custom UITableView cell containing a label should expand the cell, and a second tap should contract it. I'm able to expand and contract the cell and expand the label inside the cell, but I am not able to contract the label on the second tap. I'm using this method:

        - (void)setSelected:(BOOL)selected animated:(BOOL)animated {
            [super setSelected:selected animated:animated];
            if (selected == YES) {
                [self expandRow];
            } else {
                [self contractRow];
            }
            height = [lblFeed frame].size.height + 75;
        }

    expandRow expands the label and contractRow contracts it. I'm perplexed about how many rows this method gets called for: it isn't called only for the cell that was tapped, it gets called several times for a single tap on a single cell, possibly for other cells, and I can't tell which rows. This is really urgent. Can anybody please help?
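
    setSelected:animated: also fires when cells are reused or deselected, which is why it runs for rows other than the tapped one. A common alternative is to track the expanded row in the table view controller and toggle it in didSelectRowAtIndexPath:. A rough sketch (expandedIndexPath is a hypothetical property on the controller, and the heights are placeholders):

        - (void)tableView:(UITableView *)tableView didSelectRowAtIndexPath:(NSIndexPath *)indexPath {
            [tableView deselectRowAtIndexPath:indexPath animated:YES];
            if ([indexPath isEqual:self.expandedIndexPath]) {
                self.expandedIndexPath = nil;          // second tap: contract
            } else {
                self.expandedIndexPath = indexPath;    // first tap: expand
            }
            [tableView reloadRowsAtIndexPaths:[NSArray arrayWithObject:indexPath]
                             withRowAnimation:UITableViewRowAnimationFade];
        }

        - (CGFloat)tableView:(UITableView *)tableView heightForRowAtIndexPath:(NSIndexPath *)indexPath {
            return [indexPath isEqual:self.expandedIndexPath] ? 120.0 : 44.0;
        }

    If a previously expanded row should contract at the same time, reload that row's index path as well.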

    Read the article

  • Is DataRow thread safe? How to update a single datarow in a datatable using multiple threads? - .net

    - by NLV
    Hello all. I want to update a single DataRow in a DataTable using multiple threads. Is this actually possible? I've written the following code implementing simple multi-threading to update a single DataRow. I get different results each time. Why is it so?

        public partial class Form1 : Form {
            private static DataTable dtMain;
            private static string threadMsg = string.Empty;

            public Form1() { InitializeComponent(); }

            private void Form1_Load(object sender, EventArgs e) {
                Thread[] thArr = new Thread[5];
                dtMain = new DataTable();
                dtMain.Columns.Add("SNo");
                DataRow dRow;
                dRow = dtMain.NewRow();
                dRow["SNo"] = 5;
                dtMain.Rows.Add(dRow);
                dtMain.AcceptChanges();
                ThreadStart ts = new ThreadStart(delegate { dtUpdate(); });
                thArr[0] = new Thread(ts);
                thArr[1] = new Thread(ts);
                thArr[2] = new Thread(ts);
                thArr[3] = new Thread(ts);
                thArr[4] = new Thread(ts);
                thArr[0].Start();
                thArr[1].Start();
                thArr[2].Start();
                thArr[3].Start();
                thArr[4].Start();
                while (!WaitTillAllThreadsStopped(thArr)) {
                    Thread.Sleep(500);
                }
                foreach (Thread thread in thArr) {
                    if (thread != null && thread.IsAlive) {
                        thread.Abort();
                    }
                }
                dgvMain.DataSource = dtMain;
            }

            private void dtUpdate() {
                for (int i = 0; i < 1000; i++) {
                    try {
                        dtMain.Rows[0][0] = Convert.ToInt32(dtMain.Rows[0][0]) + 1;
                        dtMain.AcceptChanges();
                    } catch { continue; }
                }
            }

            private bool WaitTillAllThreadsStopped(Thread[] threads) {
                foreach (Thread thread in threads) {
                    if (thread != null && thread.ThreadState == ThreadState.Running) {
                        return false;
                    }
                }
                return true;
            }
        }

    Any thoughts on this? Thank you. NLV
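
    DataRow and DataTable are not thread-safe for concurrent writes, so differing results are expected: the read-increment-write in dtUpdate races between the five threads. A hedged sketch of serializing the access with a lock (the lock object name is arbitrary):

        private static readonly object tableLock = new object();

        private void dtUpdate()
        {
            for (int i = 0; i < 1000; i++)
            {
                lock (tableLock)   // only one thread touches the row at a time
                {
                    dtMain.Rows[0][0] = Convert.ToInt32(dtMain.Rows[0][0]) + 1;
                    dtMain.AcceptChanges();
                }
            }
        }

    With the lock in place the five threads should leave the cell at its starting value plus 5000, though the updates then run serially rather than in parallel.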

    Read the article

  • Query results taking too long on 200K database, speed up tips?

    - by colorfulgrayscale
    I have a SQL statement where I'm joining about 4 tables, each with 200K rows. The query runs, but keeps freezing. When I join only 3 tables instead, it returns the rows (takes about 10 secs). Any suggestion why, and any suggestions to speed it up? Thanks!

    Code:

        SELECT * FROM equipment, tiremap, workreference, tirework
        WHERE equipment.tiremap = tiremap.`TireID`
        AND tiremap.`WorkMap` = workreference.`aMap`
        AND workreference.`bMap` = tirework.workmap
        LIMIT 5

    P.S. If it helps any, I'm using SQLAlchemy to generate this code; the SQLAlchemy code for it is:

        query = session.query(equipment, tiremap, workreference, tirework)
        query = query.filter(equipment.c.tiremap == tiremap.c.TireID)
        query = query.filter(tiremap.c.WorkMap == workreference.c.aMap)
        query = query.filter(workreference.c.bMap == tirework.c.workmap)
        query = query.limit(5)
        query.all()
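
    A join like this usually stalls because one of the join columns has no index, so MySQL has to scan a 200K-row table for every candidate row. A hedged first step is to index each join column and compare the EXPLAIN output before and after (index names below are made up; the columns come from the query):

        CREATE INDEX idx_equipment_tiremap  ON equipment (tiremap);
        CREATE INDEX idx_tiremap_workmap    ON tiremap (WorkMap);
        CREATE INDEX idx_workref_amap       ON workreference (aMap);
        CREATE INDEX idx_workref_bmap       ON workreference (bMap);
        CREATE INDEX idx_tirework_workmap   ON tirework (workmap);

    tiremap.TireID presumably already has an index if it is the primary key; if not, it needs one as well.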

    Read the article

  • Efficient Search function with Linq to SQL

    - by Bayonian
    Hi, I'm using VB.NET and LINQ to SQL. I have a table with thousands of rows and growing. Right now I'm using .Contains() in the Where clause to perform the query. Below is my search function:

        Public Shared Function DemoSearchFunction(ByVal keyword As String) As DataTable
            Dim db As New BibleDataClassesDataContext()
            Dim query = From b In db.khmer_books _
                        From ch In db.khmer_chapters _
                        From v In db.testing_khmers _
                        Where v.t_v.Contains(keyword) And ch.kh_book_id = b.kh_b_id And v.t_chid = ch.kh_ch_id _
                        Select b.kh_b_id, b.kh_b_title, ch.kh_ch_id, ch.kh_ch_number, v.t_id, v.t_vn, v.t_v

            Dim dtDataTableOne = New DataTable("dtOne")
            dtDataTableOne.Columns.Add("bid", GetType(Integer))
            dtDataTableOne.Columns.Add("btitle", GetType(String))
            dtDataTableOne.Columns.Add("chid", GetType(Integer))
            dtDataTableOne.Columns.Add("chn", GetType(Integer))
            dtDataTableOne.Columns.Add("vid", GetType(Integer))
            dtDataTableOne.Columns.Add("vn", GetType(Integer))
            dtDataTableOne.Columns.Add("verse", GetType(String))

            For Each r In query
                dtDataTableOne.Rows.Add(New Object() {r.kh_b_id, r.kh_b_title, r.kh_ch_id, r.kh_ch_number, r.t_id, r.t_vn, r.t_v})
            Next

            Return dtDataTableOne
        End Function

    I would like to know other methods for doing an efficient search using LINQ to SQL. Thanks.

    Read the article

  • UNIX Timestamp to MySQL DATETIME

    - by Henk Denneboom
    Hi all, I have a table with statistics and a field named time holding Unix timestamps. There are about 200 rows in the table, and I would like to change the Unix timestamps to MySQL DATETIME without losing the current rows. What would be the best way to convert the Unix timestamps to MySQL's DATETIME? The current table:

        CREATE TABLE `stats` (
          `id` int(11) unsigned NOT NULL auto_increment,
          `time` int(11) NOT NULL,
          `domain` varchar(40) NOT NULL,
          `ip` varchar(20) NOT NULL,
          `user_agent` varchar(255) NOT NULL,
          `domain_id` int(11) NOT NULL,
          PRIMARY KEY (`id`)
        ) ENGINE=MyISAM DEFAULT CHARSET=utf8

    So the time (INT) column should become a DATETIME field. Thanks in advance!
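
    A common way to do this in place is to add a DATETIME column, populate it with FROM_UNIXTIME(), and then swap the columns. A sketch (the temporary column name time_dt is made up; try it on a copy of the table first):

        ALTER TABLE stats ADD COLUMN time_dt DATETIME NOT NULL AFTER `time`;
        UPDATE stats SET time_dt = FROM_UNIXTIME(`time`);
        ALTER TABLE stats DROP COLUMN `time`;
        ALTER TABLE stats CHANGE time_dt `time` DATETIME NOT NULL;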

    Read the article

  • java jama array problem

    - by agazerboy
    Hi all, I asked a question before but duffymo said it was not clear, so I am posting it again here. I am using the Jama API for SVD calculation. I know Jama and SVD well, but Jama's SVD does not work if the matrix has more columns than rows, which is exactly my situation. What should I do? Any help? I can't simply transpose the matrix, as that can produce wrong results. Thanks. P.S. I am calculating LSI with the help of Jama, with columns as documents and rows as terms.
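
    Since A = U*S*V' implies A' = V*S*U', one workaround is to run Jama's SVD on the transpose and swap the roles of U and V afterwards, which keeps the factorization mathematically equivalent. A rough sketch (assumes a Jama Matrix named termDoc with more columns than rows):

        import Jama.Matrix;
        import Jama.SingularValueDecomposition;

        // termDoc has fewer rows than columns, so decompose its transpose instead.
        SingularValueDecomposition svd = new SingularValueDecomposition(termDoc.transpose());

        Matrix u = svd.getV();  // V of the transpose plays the role of U for termDoc
        Matrix s = svd.getS();  // the singular values are unchanged
        Matrix v = svd.getU();  // U of the transpose plays the role of V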

    Read the article

  • Zipping Rx IObservable with infinite number set

    - by Toni Kielo
    I have an IObservable (named rows in the sample below) from the Reactive Extensions framework and I want to add index numbers to each object it observes. I've tried to implement this using the Zip function:

        rows.Zip(Enumerable.Range(1, int.MaxValue), (row, index) => new { Row = row, Index = index })
            .Subscribe(a => ProcessRow(a.Row, a.Index), () => Completed());

    Unfortunately this throws ArgumentOutOfRangeException: "Specified argument was out of the range of valid values. Parameter name: disposables". Am I misunderstanding the Zip function, or is there a problem with my code? The Range part of the code doesn't seem to be the problem, and the IObservable isn't yet receiving any events.
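
    If the goal is only to number the elements, Rx's Select overload that passes the element index avoids zipping against an "infinite" enumerable altogether. A hedged sketch (ProcessRow and Completed are the question's own methods):

        rows.Select((row, index) => new { Row = row, Index = index + 1 })   // index is zero-based
            .Subscribe(a => ProcessRow(a.Row, a.Index), () => Completed());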

    Read the article

  • Horizontal scroll winforms listview

    - by tempy
    Anyone know if it's possible to enable horizontal scrolling ONLY in a Windows Forms ListView (view mode set to large icons)? What I want to do is make a ListView whose height is only sufficient to show one row of icons, and I don't want multiple rows: just one very long row that a user has to scroll horizontally to reach the out-of-range icons. If I make the ListView scrollable, it automatically wraps the items into multiple rows and puts in a vertical scrollbar, which I don't want. Thanks in advance!

    Read the article

  • Linq Query Help Needed

    - by Randy Minder
    Say I have the following LINQ queries:

        var source = from workflow in sourceWorkflowList
                     select new { SubID = workflow.SubID, ReadTime = workflow.ReadTime, ProcessID = workflow.ProcessID, LineID = workflow.LineID };

        var target = from workflow in targetWorkflowList
                     select new { SubID = workflow.SubID, ReadTime = workflow.ReadTime, ProcessID = workflow.ProcessID, LineID = workflow.LineID };

        var difference = source.Except(target);

    sourceWorkflowList and targetWorkflowList have the exact same column definitions, but they both contain more columns of data than are shown in the queries above; those are just the columns needed for this particular issue. difference contains all rows in sourceWorkflowList that are not contained in targetWorkflowList. Now what I would like to do is remove all rows from sourceWorkflowList that do not exist in difference. Could someone show me a query that would do this? Thanks very much - Randy
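
    Because anonymous types compare by value, one way to do the trimming is to keep only the source rows whose projection still appears in difference. A hedged, untested sketch (property names follow the queries above; assumes sourceWorkflowList is a List):

        var keep = difference.ToList();   // materialize the surviving keys once

        sourceWorkflowList = sourceWorkflowList
            .Where(w => keep.Contains(new { w.SubID, w.ReadTime, w.ProcessID, w.LineID }))
            .ToList();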

    Read the article

  • Why does DataSet.HasChanges() return true if called in the same function as the change but false otherwise?

    - by Chapso
    Assuming that the buttons are being clicked in order, the following code modifies row 1 in a .NET DataSet, then row 0. HasChanges returns true, but GetChanges only gives me the change for row 0. If I comment out the line in the Button2 click event that modifies row 0, then HasChanges returns false. Can anyone explain to me why HasChanges() and GetChanges() only return true when they are called in the same function as the change to the DataSet?

        Protected Sub Button1_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles Button1.Click
            __ds1.Tables(0).Rows(1)("EmployeeName") = "TestStuff"
        End Sub

        Protected Sub Button2_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles Button2.Click
            __ds1.Tables(0).Rows(0)("EmployeeName") = "TestStuff"
            If __ds1.HasChanges(DataRowState.Modified Or DataRowState.Added) Then
                __ds2 = __ds1.GetChanges(DataRowState.Modified _
                    Or DataRowState.Added)
                GridView3.DataSource = __ds2.Tables("Users").DefaultView
                GridView3.DataBind()
            Else
            End If
        End Sub

    Read the article

  • What's the difference between !col and col=false in MySQL?

    - by Mask
    The two statements have totally different performance:

        mysql> explain select * from jobs where createIndexed=false;
        +----+-------------+-------+------+----------------------+----------------------+---------+-------+------+-------+
        | id | select_type | table | type | possible_keys        | key                  | key_len | ref   | rows | Extra |
        +----+-------------+-------+------+----------------------+----------------------+---------+-------+------+-------+
        |  1 | SIMPLE      | jobs  | ref  | i_jobs_createIndexed | i_jobs_createIndexed | 1       | const | 1    |       |
        +----+-------------+-------+------+----------------------+----------------------+---------+-------+------+-------+
        1 row in set (0.01 sec)

        mysql> explain select * from jobs where !createIndexed;
        +----+-------------+-------+------+---------------+------+---------+------+-------+-------------+
        | id | select_type | table | type | possible_keys | key  | key_len | ref  | rows  | Extra       |
        +----+-------------+-------+------+---------------+------+---------+------+-------+-------------+
        |  1 | SIMPLE      | jobs  | ALL  | NULL          | NULL | NULL    | NULL | 17996 | Using where |
        +----+-------------+-------+------+---------------+------+---------+------+-------+-------------+

    Column definition and related index, for aiding analysis:

        createIndexed tinyint(1) NOT NULL DEFAULT 0,
        create index i_jobs_createIndexed on jobs(createIndexed);

    Read the article

  • MySQL index cardinality - performance vs storage efficiency

    - by Sean
    Say you have a MySQL 5.0 MyISAM table with 100 million rows, with one index (other than the primary key) on two integer columns. From my admittedly poor understanding of B-tree structure, I believe that a lower cardinality means the storage efficiency of the index is better, because there are fewer parent nodes, whereas a higher cardinality means less efficient storage but faster read performance, because the engine has to navigate through fewer branches to get to whatever data it is looking for to narrow down the rows for the query. (Note: by "low" vs "high", I don't mean e.g. 1 million vs 99 million for a 100 million row table; I mean more like 90 million vs 95 million.) Is my understanding correct? Related question: how does cardinality affect write performance?

    Read the article

  • asp.net InsertCommand to return latest insert ID

    - by Stijn Van Loo
    Dear all, I'm unable to retrieve the latest inserted id from my SQL Server 2000 db using a typed dataset in ASP.NET. I have created a TableAdapter and ticked "Refresh the data table" and "Generate Insert, Update and Delete statements". This auto-generates the Fill and GetData methods and the Insert, Update, Select and Delete statements. I have tried every possible solution in this thread http://forums.asp.net/t/990365.aspx but I'm still unsuccessful; it always returns 1 (= the number of affected rows). I do not want to create a separate insert method, as the auto-generated InsertCommand perfectly suits my needs. As suggested in the thread above, I have tried to update the InsertCommand SQL syntax to add SELECT SCOPE_IDENTITY() or something similar, and I have tried to add a parameter of type ReturnValue, but all I get is the number of affected rows. Does anyone have a different take on this? Thanks in advance! Stijn
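
    One pattern that works with data adapters is to append a SELECT of SCOPE_IDENTITY() to the InsertCommand and tell the adapter to copy the first returned record back into the inserted row. A hedged sketch (table and column names are placeholders, not the real schema):

        INSERT INTO MyTable (Name) VALUES (@Name);
        SELECT CAST(SCOPE_IDENTITY() AS int) AS Id;

    The returned column has to be aliased to the identity column's name, and the command's UpdatedRowSource should be set to UpdateRowSource.FirstReturnedRecord so the adapter writes the value back into the DataRow after the insert; the adapter method itself will still return the affected-row count.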

    Read the article

  • Use ASP.Net server control code generation from .ashx

    - by Matt Dawdy
    I'm trying to take an existing bunch of code that was previously on a full .aspx page and do the same stuff in an .ashx handler. The code created an HtmlTable object, added rows and cells to those rows, then added that HTML table to the .aspx page's controls collection, inside a div that was already on the page. I am trying to keep the code intact, but instead of putting the control into a div, I want to actually generate the HTML and return it as a big chunk of text that can be requested via AJAX on the client side. HtmlTable errors out when I try to use the InnerHtml property (it says it isn't supported), and when I try RenderControl, after creating first a TextWriter and then an HtmlTextWriter object, I get the error that Page cannot be null. Has anyone done this before? Any suggestions?
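
    One workaround that is often suggested for rendering server controls outside a page is to host the control in a throwaway Page and let Server.Execute capture the markup. A hedged, untested sketch for the .ashx (BuildTable stands in for the existing code that creates the HtmlTable):

        public void ProcessRequest(HttpContext context)
        {
            HtmlTable table = BuildTable();          // existing code that fills the table

            // Host the control in a blank page with a form so rendering is allowed.
            Page page = new Page();
            HtmlForm form = new HtmlForm();
            page.Controls.Add(form);
            form.Controls.Add(table);

            StringWriter writer = new StringWriter();
            context.Server.Execute(page, writer, false);   // renders the page into the writer

            context.Response.ContentType = "text/html";
            context.Response.Write(writer.ToString());
        }

    The output includes the wrapping form and hidden fields, so the client-side code may need to pull out just the table markup.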

    Read the article

  • speed up a sql query to mysql?

    - by fayer
    In my MySQL database I've got the geonames database, containing all countries, states and cities. I am using this to create a cascading menu so the user can select where he is from: country - state - county - city. The main problem is that the query searches through all 7 million rows in that table each time I want to get the list of child rows, and that takes a while, 10-15 seconds. I wonder how I could speed this up: caching? table views? reorganizing the table structure somehow? And most important, how do I do these things? Are there good tutorials you could link me to? I appreciate all help and feedback discussing smart ways of handling this issue!
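
    If the children of a node are found through a parent reference, an index on that column lets MySQL read only the matching rows instead of scanning all 7 million on every menu level. A hedged sketch (the table and column names here are assumptions about the imported geonames schema, not its real layout):

        -- one-time: index the column used to look up a node's children
        CREATE INDEX idx_geo_parent ON geonames (parent_id);

        -- per menu level: fetch only the children of the selected node
        SELECT id, name FROM geonames WHERE parent_id = ? ORDER BY name;

    Caching the top levels on the application side (countries and states rarely change) is a cheap addition once the query itself is indexed.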

    Read the article

  • How do I create an empty custom table in Wix

    - by Samuel Jack
    How do I get WiX to include a CustomTable with no rows in the final MSI? If I simply define the table like this:

        <CustomTable Id="MyTable">
          <Column Id="Id" Type="string" Category="Identifier" PrimaryKey="yes"/>
          <Column Id="Root" Type="string"/>
          <Column Id="Key" Type="string"/>
          <Column Id="Name" Type="string"/>
        </CustomTable>

    WiX omits it from the final output. My CustomAction is expecting it to be there, so that it can add rows to it during execution. Any ideas?
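
    WiX only emits tables that something references, and the EnsureTable element exists for this case: it forces an otherwise-empty table into the MSI so a custom action can populate it at run time. A hedged sketch, placed alongside the table definition above:

        <EnsureTable Id="MyTable"/>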

    Read the article

  • Why is Oracle using a skip scan for this query?

    - by Jason Baker
    Here's the tkprof output for a query that's running extremely slowly (WARNING: it's long :-) ): SELECT mbr_comment_idn, mbr_crt_dt, mbr_data_source, mbr_dol_bl_rmo_ind, mbr_dxcg_ctl_member, mbr_employment_start_dt, mbr_employment_term_dt, mbr_entity_active, mbr_ethnicity_idn, mbr_general_health_status_code, mbr_hand_dominant_code, mbr_hgt_feet, mbr_hgt_inches, mbr_highest_edu_level, mbr_insd_addr_idn, mbr_insd_alt_id, mbr_insd_name, mbr_insd_ssn_tin, mbr_is_smoker, mbr_is_vip, mbr_lmbr_first_name, mbr_lmbr_last_name, mbr_marital_status_cd, mbr_mbr_birth_dt, mbr_mbr_death_dt, mbr_mbr_expired, mbr_mbr_first_name, mbr_mbr_gender_cd, mbr_mbr_idn, mbr_mbr_ins_type, mbr_mbr_isreadonly, mbr_mbr_last_name, mbr_mbr_middle_name, mbr_mbr_name, mbr_mbr_status_idn, mbr_mpi_id, mbr_preferred_am_pm, mbr_preferred_time, mbr_prv_innetwork, mbr_rep_addr_idn, mbr_rep_name, mbr_rp_mbr_id, mbr_same_mbr_ins, mbr_special_needs_cd, mbr_timezone, mbr_upd_dt, mbr_user_idn, mbr_wgt, mbr_work_status_idn FROM (SELECT /*+ FIRST_ROWS(1) */ mbr_comment_idn, mbr_crt_dt, mbr_data_source, mbr_dol_bl_rmo_ind, mbr_dxcg_ctl_member, mbr_employment_start_dt, mbr_employment_term_dt, mbr_entity_active, mbr_ethnicity_idn, mbr_general_health_status_code, mbr_hand_dominant_code, mbr_hgt_feet, mbr_hgt_inches, mbr_highest_edu_level, mbr_insd_addr_idn, mbr_insd_alt_id, mbr_insd_name, mbr_insd_ssn_tin, mbr_is_smoker, mbr_is_vip, mbr_lmbr_first_name, mbr_lmbr_last_name, mbr_marital_status_cd, mbr_mbr_birth_dt, mbr_mbr_death_dt, mbr_mbr_expired, mbr_mbr_first_name, mbr_mbr_gender_cd, mbr_mbr_idn, mbr_mbr_ins_type, mbr_mbr_isreadonly, mbr_mbr_last_name, mbr_mbr_middle_name, mbr_mbr_name, mbr_mbr_status_idn, mbr_mpi_id, mbr_preferred_am_pm, mbr_preferred_time, mbr_prv_innetwork, mbr_rep_addr_idn, mbr_rep_name, mbr_rp_mbr_id, mbr_same_mbr_ins, mbr_special_needs_cd, mbr_timezone, mbr_upd_dt, mbr_user_idn, mbr_wgt, mbr_work_status_idn, ROWNUM AS ora_rn FROM (SELECT mbr.comment_idn AS mbr_comment_idn, mbr.crt_dt AS mbr_crt_dt, mbr.data_source AS mbr_data_source, mbr.dol_bl_rmo_ind AS mbr_dol_bl_rmo_ind, mbr.dxcg_ctl_member AS mbr_dxcg_ctl_member, mbr.employment_start_dt AS mbr_employment_start_dt, mbr.employment_term_dt AS mbr_employment_term_dt, mbr.entity_active AS mbr_entity_active, mbr.ethnicity_idn AS mbr_ethnicity_idn, mbr.general_health_status_code AS mbr_general_health_status_code, mbr.hand_dominant_code AS mbr_hand_dominant_code, mbr.hgt_feet AS mbr_hgt_feet, mbr.hgt_inches AS mbr_hgt_inches, mbr.highest_edu_level AS mbr_highest_edu_level, mbr.insd_addr_idn AS mbr_insd_addr_idn, mbr.insd_alt_id AS mbr_insd_alt_id, mbr.insd_name AS mbr_insd_name, mbr.insd_ssn_tin AS mbr_insd_ssn_tin, mbr.is_smoker AS mbr_is_smoker, mbr.is_vip AS mbr_is_vip, mbr.lmbr_first_name AS mbr_lmbr_first_name, mbr.lmbr_last_name AS mbr_lmbr_last_name, mbr.marital_status_cd AS mbr_marital_status_cd, mbr.mbr_birth_dt AS mbr_mbr_birth_dt, mbr.mbr_death_dt AS mbr_mbr_death_dt, mbr.mbr_expired AS mbr_mbr_expired, mbr.mbr_first_name AS mbr_mbr_first_name, mbr.mbr_gender_cd AS mbr_mbr_gender_cd, mbr.mbr_idn AS mbr_mbr_idn, mbr.mbr_ins_type AS mbr_mbr_ins_type, mbr.mbr_isreadonly AS mbr_mbr_isreadonly, mbr.mbr_last_name AS mbr_mbr_last_name, mbr.mbr_middle_name AS mbr_mbr_middle_name, mbr.mbr_name AS mbr_mbr_name, mbr.mbr_status_idn AS mbr_mbr_status_idn, mbr.mpi_id AS mbr_mpi_id, mbr.preferred_am_pm AS mbr_preferred_am_pm, mbr.preferred_time AS mbr_preferred_time, mbr.prv_innetwork AS mbr_prv_innetwork, mbr.rep_addr_idn AS mbr_rep_addr_idn, mbr.rep_name AS 
mbr_rep_name, mbr.rp_mbr_id AS mbr_rp_mbr_id, mbr.same_mbr_ins AS mbr_same_mbr_ins, mbr.special_needs_cd AS mbr_special_needs_cd, mbr.timezone AS mbr_timezone, mbr.upd_dt AS mbr_upd_dt, mbr.user_idn AS mbr_user_idn, mbr.wgt AS mbr_wgt, mbr.work_status_idn AS mbr_work_status_idn FROM mbr JOIN mbr_identfn ON mbr.mbr_idn = mbr_identfn.mbr_idn WHERE mbr_identfn.mbr_idn = mbr.mbr_idn AND mbr_identfn.identfd_type = :identfd_type_1 AND mbr_identfn.identfd_number = :identfd_number_1 AND mbr_identfn.entity_active = :entity_active_1) WHERE ROWNUM <= :ROWNUM_1) WHERE ora_rn > :ora_rn_1 call count cpu elapsed disk query current rows ------- ------ -------- ---------- ---------- ---------- ---------- ---------- Parse 9936 0.46 0.49 0 0 0 0 Execute 9936 0.60 0.59 0 0 0 0 Fetch 9936 329.87 404.00 0 136966922 0 0 ------- ------ -------- ---------- ---------- ---------- ---------- ---------- total 29808 330.94 405.09 0 136966922 0 0 Misses in library cache during parse: 0 Optimizer mode: FIRST_ROWS Parsing user id: 36 (JIVA_DEV) Rows Row Source Operation ------- --------------------------------------------------- 0 VIEW (cr=102 pr=0 pw=0 time=2180 us) 0 COUNT STOPKEY (cr=102 pr=0 pw=0 time=2163 us) 0 NESTED LOOPS (cr=102 pr=0 pw=0 time=2152 us) 0 INDEX SKIP SCAN IDX_MBR_IDENTFN (cr=102 pr=0 pw=0 time=2140 us)(object id 341053) 0 TABLE ACCESS BY INDEX ROWID MBR (cr=0 pr=0 pw=0 time=0 us) 0 INDEX UNIQUE SCAN PK_CLAIMANT (cr=0 pr=0 pw=0 time=0 us)(object id 334044) Rows Execution Plan ------- --------------------------------------------------- 0 SELECT STATEMENT MODE: HINT: FIRST_ROWS 0 VIEW 0 COUNT (STOPKEY) 0 NESTED LOOPS 0 INDEX MODE: ANALYZED (SKIP SCAN) OF 'IDX_MBR_IDENTFN' (INDEX (UNIQUE)) 0 TABLE ACCESS MODE: ANALYZED (BY INDEX ROWID) OF 'MBR' (TABLE) 0 INDEX MODE: ANALYZED (UNIQUE SCAN) OF 'PK_CLAIMANT' (INDEX (UNIQUE)) ******************************************************************************** Based on my reading of Oracle's documentation of skip scans, a skip scan is most useful when the first column of an index has a low number of unique values. The thing is that the first index of this column is a unique primary key. So am I correct in assuming that a skip scan is the wrong thing to do here? Also, what kind of scan should it be doing? Should I do some more hinting for this query? EDIT: I should also point out that the query's where clause uses the columns in IDX_MBR_IDENTFN and no columns other than what's in that index. So as far as I can tell, I'm not skipping any columns.
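
    A skip scan shows up when the predicate binds only trailing columns of a composite index, so Oracle has to probe every distinct value of the leading column. If IDX_MBR_IDENTFN leads with mbr_idn, one hedged option is an index that leads with the columns the WHERE clause actually supplies (the index name is made up; the columns come from the query):

        CREATE INDEX idx_mbr_identfn_lkp
          ON mbr_identfn (identfd_type, identfd_number, entity_active, mbr_idn);

    With such an index the lookup can become an ordinary range scan followed by the existing unique scan on PK_CLAIMANT.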

    Read the article

  • Find and replace string in MySQL using data from another table

    - by Charlie
    Hi, sorry for the wonky formatting, but I hope you can understand it. I have two MySQL tables, and I want to find and replace text strings in one using data in the other.

    Texts - one column, messages:

        'thx guys'
        'i think u r great'
        'thx again'
        ' u rock'

    Dictionary - two columns, bad_spelling and good_spelling:

        'thx'   'thanks'
        ' u '   ' you '
        ' r '   ' are '

    I want SQL to go through every row in messages and replace every instance of bad_spelling with good_spelling, and to do this for all the pairs of bad_spelling and good_spelling. The closest I have gotten is this:

        update texts, dictionary
        set texts.message = replace(texts.message, dictionary.bad_spelling, dictionary.good_spelling)

    But this only changes 'thx' to 'thanks' (in 2 rows) and does not go on to replace ' u ' with ' you ' or ' r ' with ' are '. Any ideas how to make it use all the rows in dictionary in the replace statement? PS: I forgot to mention that this is a small example; in the real thing I will have a lot of find/replace pairs, which may get added to over time.
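
    A multi-table UPDATE applies at most one matching dictionary row to each text row, which is why only the 'thx' pair takes effect. A hedged way to apply every pair is a small stored procedure that loops over the dictionary with a cursor (untested sketch; table and column names follow the question):

        DELIMITER //
        CREATE PROCEDURE apply_dictionary()
        BEGIN
          DECLARE done INT DEFAULT 0;
          DECLARE bad, good VARCHAR(255);
          DECLARE cur CURSOR FOR SELECT bad_spelling, good_spelling FROM dictionary;
          DECLARE CONTINUE HANDLER FOR NOT FOUND SET done = 1;
          OPEN cur;
          fix_loop: LOOP
            FETCH cur INTO bad, good;
            IF done THEN
              LEAVE fix_loop;
            END IF;
            -- apply this one find/replace pair to every message
            UPDATE texts SET message = REPLACE(message, bad, good);
          END LOOP;
          CLOSE cur;
        END //
        DELIMITER ;

        CALL apply_dictionary();

    Re-running the procedure after new pairs are added to dictionary picks them up automatically.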

    Read the article

  • vb.net | Update DB with OleDB

    - by liron
    I wrote a module for a connection to a DB with OleDB, and the sub UpdateClients doesn't work; the DB doesn't update. What's missing or wrong?

        Module mdlDB
            Const CONNECTION_STRING As String = _
                "provider= Microsoft.Jet.OleDB.4.0;Data Source=DbHalf.mdb;mode= Share Deny None"
            Dim daClient As New OleDb.OleDbDataAdapter
            Dim dsClient As New DataSet
            Dim cmClient As CurrencyManager

            Public Sub OpenClients(ByVal txtId, ByVal txtName, ByVal BindingContext)
                Dim Con As New OleDb.OleDbConnection(CONNECTION_STRING)
                Dim sqlClient As New OleDb.OleDbCommand
                Con.Open()
                sqlClient.CommandText = "SELECT*"
                sqlClient.CommandText += "FROM tblClubClient"
                sqlClient.Connection = Con
                daClient.SelectCommand = sqlClient
                dsClient.Clear()
                daClient.Fill(dsClient, "CLUB_CLIENT")
                cmClient = BindingContext(dsClient, "CLUB_CLIENT")
                cmClient.Position = 0
                txtId.DataBindings.Add("text", dsClient, "CLUB_CLIENT.ClntId")
                txtName.DataBindings.Add("text", dsClient, "CLUB_CLIENT.ClntName")
                Con.Close()
            End Sub

            Public Sub UpdateClients(ByVal txtId, ByVal txtName, ByVal BindingContext)
                Dim cb As New OleDb.OleDbCommandBuilder(daClient)
                cmClient = BindingContext(dsClient, "CLUB_CLIENT")
                dsClient.Tables("CLUB_CLIENT").Rows(cmClient.Position).Item("ClntId") = txtId.Text
                dsClient.Tables("CLUB_CLIENT").Rows(cmClient.Position).Item("ClntName") = txtName.Text
                daClient.Update(dsClient, "CLUB_CLIENT")
            End Sub
        End Module
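
    Two things seem worth checking here. First, the concatenated CommandText comes out as "SELECT*FROM tblClubClient" with no spaces, so the SELECT may not run as intended. Second, edits typed into bound controls only reach the DataSet once the CurrencyManager ends the current edit, so it is worth calling EndCurrentEdit before Update. A hedged variant of UpdateClients (assumes tblClubClient has a primary key, which the CommandBuilder needs in order to generate an UPDATE):

        Public Sub UpdateClients(ByVal txtId, ByVal txtName, ByVal BindingContext)
            Dim cb As New OleDb.OleDbCommandBuilder(daClient)
            cmClient = BindingContext(dsClient, "CLUB_CLIENT")
            cmClient.EndCurrentEdit()   ' push pending bound edits into the DataSet
            dsClient.Tables("CLUB_CLIENT").Rows(cmClient.Position).Item("ClntId") = txtId.Text
            dsClient.Tables("CLUB_CLIENT").Rows(cmClient.Position).Item("ClntName") = txtName.Text
            daClient.Update(dsClient, "CLUB_CLIENT")
        End Sub

    If the .mdb lives in the project folder, also check that a fresh copy is not overwriting the updated one in the output directory on every build.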

    Read the article

  • How to populate List<string> with Datarow values from single columns...

    - by James
    Hi, I'm still learning (baby steps). I'm messing about with a function and hoping to find a tidier way to deal with my DataTables. For the more commonly used tables throughout the life of the program, I'll dump them to DataTables and query those instead. What I'm hoping to do is query a DataTable for, say, column x = "this", and convert the values of column "y" directly into a List to return to the caller:

        private List<string> LookupColumnY(string hex)
        {
            List<string> stringlist = new List<string>();
            DataRow[] rows = tblDataTable.Select("Columnx = '" + hex + "'", "Columny ASC");
            foreach (DataRow row in rows)
            {
                stringlist.Add(row["Columny"].ToString());
            }
            return stringlist;
        }

    Anyone know a slightly simpler method? I guess this is easy enough, but I'm wondering, if I do enough of these, whether iterating via a foreach loop will be a performance hit. TIA!
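
    The loop itself is fine, and most of the cost is in DataTable.Select rather than the iteration, but LINQ can express the same projection in one statement. A hedged equivalent (same column names as above; needs a using System.Linq directive):

        private List<string> LookupColumnY(string hex)
        {
            // Select() filters and sorts exactly as before; the LINQ Select projects column y.
            return tblDataTable.Select("Columnx = '" + hex + "'", "Columny ASC")
                               .Select(row => row["Columny"].ToString())
                               .ToList();
        }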

    Read the article
