Search Results

Search found 7311 results on 293 pages for 'rows'.


  • Multiple LIKE in SQL

    - by ninumedia
    I want to search through multiple rows and obtain the row that contains a particular item. The table in MySQL is set up so each id has a unique, comma-delimited list of values per row. For example:

        id | order
        1  | 1,3,8,19,34,2,38
        2  | 4,7,2,190,38

    Now, if I wanted to pull the row that contains just the number 19, how would I go about doing this? The positions I could figure the value might take for a LIKE condition are '19,' (at the start), ',19' (at the end) and ',19,' (in the middle). I tried the following and I cannot obtain any results:

        SELECT * FROM categories WHERE order LIKE '19,%' OR '%,19%' OR '%,19%' LIMIT 0, 30

    Thank you for your help!
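    One way to sidestep the per-position LIKE patterns is MySQL's FIND_IN_SET, which treats the column as a comma-separated set. A hedged sketch, reusing the table and column names from the question (order is a reserved word, so it is back-quoted; the unquoted order in the original query is itself a syntax error, and the REPLACE guards against stray spaces after commas):

        -- Return rows whose comma-delimited list contains the value 19
        SELECT *
        FROM categories
        WHERE FIND_IN_SET('19', REPLACE(`order`, ' ', '')) > 0
        LIMIT 0, 30;

    An equivalent LIKE-based trick is to wrap the column in delimiters first, e.g. CONCAT(',', `order`, ',') LIKE '%,19,%', so a single pattern covers the start, middle and end positions.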

    Read the article

  • C# Importing Large Volume of Data from CSV to Database

    - by guazz
    What is the most efficient method to load large volumes of data from CSV (3 million+ rows) into a database? The data needs to be formatted (e.g. the name column needs to be split into first name and last name, etc.). I need to do this as efficiently as possible, i.e. there are time constraints. I am siding with the option of reading, transforming and loading the data row-by-row using a C# application. Is this ideal? If not, what are my options? Should I use multithreading?
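    A common alternative to transforming row-by-row in the client is to bulk-load the raw file into a staging table and do the split with one set-based statement. A hedged T-SQL sketch, assuming SQL Server and made-up table, column and path names:

        -- Staging table mirrors the raw CSV columns
        CREATE TABLE StagingPeople (FullName NVARCHAR(200), Email NVARCHAR(200));

        -- Bulk-load the file (the SQL Server service account must be able to read the path;
        -- FIRSTROW = 2 skips a header row)
        BULK INSERT StagingPeople
        FROM 'C:\data\people.csv'
        WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

        -- Split FullName on the first space while copying into the real table
        INSERT INTO People (FirstName, LastName, Email)
        SELECT LEFT(FullName, CHARINDEX(' ', FullName + ' ') - 1),
               LTRIM(SUBSTRING(FullName, CHARINDEX(' ', FullName + ' ') + 1, 200)),
               Email
        FROM StagingPeople;

    If the transform has to stay in the application, SqlBulkCopy from C# gives a similar bulk path into the staging table.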

    Read the article

  • ASP.NET MVC: Is it possible to set a global variable?

    - by Sergio
    Hello, I have a process within my MVC 2 application that takes a large amount of time and alters many rows in the database along the way. There is a chance that two or more users could attempt to perform this action at the same time, which would lead to undesirable effects. Is there a way to set a global flag somewhere within ASP.NET that I can check against all requests to see whether the action in question is currently being executed (a bit that I flip prior to running the query, and then flip back on completion)? Or is there a better way of handling this situation? Thanks
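    An in-process flag only covers a single web server and a single app domain; one database-side alternative is to serialize the action with an application lock. A hedged T-SQL sketch, assuming the backing store is SQL Server (the resource name is made up):

        BEGIN TRAN;
        DECLARE @result INT;
        -- Try to take an exclusive, transaction-scoped lock; fail immediately
        -- if another request already holds it instead of queuing behind it.
        EXEC @result = sp_getapplock
             @Resource = 'BulkAlterJob',
             @LockMode = 'Exclusive',
             @LockOwner = 'Transaction',
             @LockTimeout = 0;
        IF @result < 0
            PRINT 'The action is already running for another user.';
        ELSE
        BEGIN
            -- ... the long-running updates go here ...
            PRINT 'Lock acquired, work done.';
        END
        COMMIT TRAN;   -- a transaction-owned applock is released automatically here

    The web tier then only needs to surface the "already running" result to the second user.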

    Read the article

  • Database Design Question

    - by deniz
    Hi, I am designing a database for a project. I have a table that has 10 columns, most of which are used whenever the table is accessed, and I need to add 3 more columns: View Count, Thumbs Up (count) and Thumbs Down (count), which will be used in 90% of the queries against the table. So my question is whether it is better to break the table up and create a new table holding these 3 columns plus a foreign key, or just make it 13 columns and use no joins. Since these columns will be used frequently, I guess adding 3 more columns is better, but if I needed to add 10 more columns that will be used 90% of the time, should I add them as well, or create a new table and use joins? I am not sure when to split the table if the columns are used very frequently. Do you have any suggestions? Thanks in advance,

    Read the article

  • Choosing proper database for a few users application

    - by tomo
    Requirements:
      - tiny WinForms client app (C# 4.0, WinForms or WPF)
      - a few users working simultaneously
      - no database service at all - the whole engine ships as DLLs inside the client app
      - database available as a shared folder on one computer
      - at least simple concurrency checks
      - compatible with NHibernate or Entity Framework / .NET 4.0
      - backup as simple as copying files from the shared folder, assuming no clients are running at the moment
      - no stored procedures/triggers required
      - data size: a few tables and a few thousand rows after 2 years
    Nice to have:
      - user access rights
      - encrypted data
    I'm trying to choose between MS Access, SQLite and SQL Server Compact Edition. Can you recommend which one would best fit these requirements?

    Read the article

  • how to aggregate this data in R

    - by stevejb
    Hello, I have a data frame in R with the following structure:

        > testData
                    date exch.code comm.code     oi
        1     1997-12-30       CBT         1 468710
        2     1997-12-23       CBT         1 457165
        3     1997-12-19       CBT         1 461520
        4     1997-12-16       CBT         1 444190
        5     1997-12-09       CBT         1 446190
        6     1997-12-02       CBT         1 443085
        ....
        77827 2004-10-26      NYME       967  10038
        77828 2004-10-19      NYME       967   9910
        77829 2004-10-12      NYME       967  10195
        77830 2004-09-28      NYME       967   9970
        77831 2004-08-31      NYME       967   9155
        77832 2004-08-24      NYME       967   8655

    What I want to do is produce a table that shows, for a given date and commodity, the total oi across every exchange code. So the rows would be made up of unique(testData$date), the columns would be unique(testData$comm.code), and each cell would be the total oi over all exch.codes on a given day. Thanks,

    Read the article

  • Storing search result for paging and sorting

    - by Mattias
    I've been implementing MS Search Server 2010 and so far it's really good. I'm doing the search queries via their web service, but due to the inconsistent results, I'm thinking about caching the results instead. The site is a small intranet (500 employees), so it shouldn't be a problem, but I'm curious what approach you would take if it were a bigger site. I've googled a bit, but haven't really come across anything specific. So, a few questions: What other approaches are there, and why are they better? How much does it cost to store a DataView of 400-500 rows? What sizes are feasible? What other points should I take into consideration? Any input is welcome :)

    Read the article

  • Filter by virtual column?

    - by user329957
    I have the following database structure:

        [Order]   OrderId, Total
        [Payment] OrderId, Amount

    Every Order can have any number of Payment rows. I want to get only the orders where the sum of all the payments is less than the order Total. I have the following SQL, but it returns all the orders, paid and unpaid:

        SELECT o.OrderId, o.UserId, o.Total, o.DateCreated, COALESCE(SUM(p.Amount), 0) AS Paid
        FROM [Order] o
        LEFT JOIN Payment p ON p.OrderId = o.OrderId
        GROUP BY o.OrderId, o.Total, o.UserId, o.DateCreated

    I have tried to add WHERE (Paid < o.Total) but it does not work. Any idea? BTW, I'm using SQL CE 3.5.
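    The alias Paid is computed after grouping, so it cannot be referenced in a WHERE clause; repeating the aggregate expression in a HAVING clause is the usual fix. A hedged sketch (standard SQL Server syntax, which SQL CE 3.5 should also accept):

        SELECT o.OrderId, o.UserId, o.Total, o.DateCreated,
               COALESCE(SUM(p.Amount), 0) AS Paid
        FROM [Order] o
        LEFT JOIN Payment p ON p.OrderId = o.OrderId
        GROUP BY o.OrderId, o.Total, o.UserId, o.DateCreated
        HAVING COALESCE(SUM(p.Amount), 0) < o.Total;   -- only under-paid orders survive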

    Read the article

  • Validating Column Data Stored as CSV Against Another Table

    - by Jakkwylde
    I wanted to see what some suggested approaches would be to validate a field that is stored as CSV against a table containing the appropriate values. Although it would be desirable, it is NOT an option to split the CSV list into another related table. In the example data below I would be trying to catch the code 99, which has no match in the lookup table. Below is an example data representation.

        Table: Widgets
        WidgetName   WidgetCodeList
        A            1, 2, 3
        B            1
        C            2, 3
        D            99

        Table: WidgetCodes
        WidgetCode
        1
        2
        3

    An earlier approach was to query the CSV column as rows using various string manipulations and CONNECT_BY_LEVEL, but the performance was not acceptable.
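    Since CONNECT_BY_LEVEL is mentioned, this looks like Oracle. Without splitting the CSV into rows, one hedged sketch is to wrap both sides in delimiters and join on LIKE, which lists the known codes present in each widget's list; note this only covers the membership half of the validation, since spotting a value like 99 that is absent from WidgetCodes still needs the list tokenized:

        -- Known codes that appear in each widget's comma-delimited list
        SELECT w.WidgetName, c.WidgetCode
        FROM   Widgets w
        JOIN   WidgetCodes c
          ON   ',' || REPLACE(w.WidgetCodeList, ' ', '') || ','
               LIKE '%,' || TO_CHAR(c.WidgetCode) || ',%';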

    Read the article

  • C++ - Opening a file inside a function using fopen

    - by Josh
    I am using Visual Studio 2005 (C++). I am passing a string into a function as a char array. I want to open the file passed in as a parameter and use it. I know my code works to an extent, because if I hardcode the filename as the first parameter it works perfectly. I do notice that if I look at the value in a watch, the value includes the address alongside the string literal. I have tried passing in the filename as a pointer, but it then complains about type conversion with __w64. As I said before, it works fine with "filename.txt" in place of fileName. I am stumped.

        void read(char fileName[50], int destArray[MAX_R][MAX_C], int demSize[2])
        {
            int rows = 0;
            int cols = 0;
            int row = 0;
            int col = 0;
            FILE * f = fopen(fileName, "r");
            ...

    Read the article

  • How to create reusable WPF grid layout

    - by zendar
    I have a window with a tab control and a number of pages (tab items). Each tab item has the same grid layout: 6 rows and 4 columns. Right now, each tab item contains a grid with its own row and column definitions, so almost half of the XAML is grid definitions. How can I define this grid in one place and reuse that definition in my application? A template? A user control? Besides 6x4, I have only two more grid dimensions that repeat: 8x4 and 6x6.

    Read the article

  • Postgres : Post statement (or insert) asynchronous, non-blocking processing.

    - by Hassan Syed
    I'm wondering whether it is possible, after a collection of rows is inserted, to initiate an operation that is executed asynchronously, is non-blocking, and doesn't need to inform the originator of the request of the result. I am working with large numbers of events and I can guarantee that the post-insert logic will not fail -- I just want to have a single insert thread in my event sources, and I want this thread to keep flying without blocking and without being responsible for any post-delivery book-keeping. I would potentially have 100 of these jobs executing concurrently, and each job might operate on 5 tables with anywhere between 200-1000 inserts on each of those tables. A hint in the right direction should be enough.
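    One common Postgres pattern for this is to have the insert path only enqueue the post-processing: an AFTER INSERT trigger writes a job row and fires a NOTIFY, and a separate worker session that has issued LISTEN picks the work up after commit. A hedged sketch; the events table, its id column and the channel name are all made up:

        -- Queue table the trigger writes into
        CREATE TABLE post_insert_jobs (
            job_id   bigserial PRIMARY KEY,
            event_id bigint      NOT NULL,
            queued   timestamptz NOT NULL DEFAULT now()
        );

        CREATE OR REPLACE FUNCTION queue_post_insert() RETURNS trigger AS $$
        BEGIN
            INSERT INTO post_insert_jobs (event_id) VALUES (NEW.id);
            NOTIFY post_insert_work;   -- delivered to listeners when the transaction commits
            RETURN NEW;
        END;
        $$ LANGUAGE plpgsql;

        CREATE TRIGGER events_queue_post_insert
        AFTER INSERT ON events
        FOR EACH ROW EXECUTE PROCEDURE queue_post_insert();

    The inserting thread never runs the post-insert logic itself; it only pays for one extra insert per row, and the worker does the book-keeping out of band.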

    Read the article

  • Changing ylim (axis limits) drops data falling outside range. How can this be prevented?

    - by Alex Holcombe
        df <- data.frame(age=c(10,10,20,20,25,25,25),veg=c(0,1,0,1,1,0,1))
        g=ggplot(data=df,aes(x=age,y=veg))
        g=g+stat_summary(fun.y=mean,geom="point")

    Points reflect the mean of veg at each age, which is what I expected and want to preserve after changing the axis limits with the command below.

        g=g+ylim(0.2,1)

    Changing the axis limits with the above command unfortunately causes the veg==0 subset to be dropped from the data, yielding "Warning message: Removed 4 rows containing missing values (stat_summary)". This is bad because the plot (the stat_summary mean) now omits the veg==0 points. How can this be prevented? I simply want to avoid showing the empty part of the plot - the ordinate from 0 to 0.2 - without dropping the associated data from the stat_summary calculation.

    Read the article

  • Fulltext and composite indexes and how they affect the query

    - by Brett
    Say I had a query like the one below:

        SELECT name, category, address, city, state
        FROM table
        WHERE MATCH(name, subcategory, category, tag1) AGAINST('education')
        AND city='Oakland' AND state='CA'
        LIMIT 0, 10;

    and I had a fulltext index on name, subcategory, category, tag1 and a composite index on city, state; is this good enough for this query? I'm just wondering if something extra is needed when mixing additional ANDs with the fulltext index and MATCH/AGAINST. Edit: What I am trying to understand is what happens with the additional columns that are in the query but are not part of the chosen index (the fulltext index) - in the above example, city and state. How does MySQL then find the matching rows for these, since it can't use two indexes (or can it)? So, basically, I'm trying to understand how MySQL goes about finding the data optimally for the columns NOT in the chosen fulltext index, and whether there is anything I can or should do to optimize the query.
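    A hedged way to see the answer for a concrete schema is to ask the optimizer directly: with a MATCH() predicate MySQL typically picks the fulltext index and then evaluates the remaining AND conditions against each row it fetches, and EXPLAIN makes the chosen index and the extra filtering visible. A sketch reusing the query from the question (the table name is the question's placeholder and is back-quoted because table is a reserved word; a real name would not need quoting):

        EXPLAIN
        SELECT name, category, address, city, state
        FROM `table`
        WHERE MATCH(name, subcategory, category, tag1) AGAINST('education')
        AND city='Oakland' AND state='CA'
        LIMIT 0, 10;

    A join type of fulltext in the output means the city/state composite index is not used for this access path; it still helps queries that filter on city and state without MATCH().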

    Read the article

  • JOIN two tables to show already purchased items

    - by Norbert
    I have a table where I keep all my templates:

        templates: template_id, template_name, template_price

    These templates can be purchased by a registered user, and each purchase is inserted into the payments table:

        payments: payment_id, template_id, user_id

    Is there a way to join these two tables and get not just a list of the templates that have been purchased by a certain user, but all the templates, and then figure out from there which ones have already been purchased? I used this SELECT, but only the ones that the user bought showed up. I would like to have all the rows from templates, with the payment columns empty in case the user_id doesn't match.

        SELECT * FROM templates
        LEFT JOIN payments ON templates.template_id = payments.template_id
        WHERE user_id = 2
        GROUP BY templates.template_id
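    The WHERE user_id = 2 filter removes the unmatched template rows again, because their user_id is NULL after the LEFT JOIN. A hedged sketch of the usual fix - move the user filter into the join condition so every template survives and the payment columns stay NULL for templates the user has not bought:

        SELECT t.*, p.payment_id
        FROM templates t
        LEFT JOIN payments p
               ON p.template_id = t.template_id
              AND p.user_id = 2
        ORDER BY t.template_id;   -- payment_id IS NULL marks a template not yet purchased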

    Read the article

  • T-SQL: How to use GROUP BY and get the value which exceeds 60%?

    - by Torben H.
    Hello, sorry for the bad title, I don't know how to describe my problem. I have the following table:

        | ItemID | Date     |
        ---------------------
        | 1      | 01.01.10 |
        | 1      | 03.01.10 |
        | 1      | 05.01.10 |
        | 1      | 06.01.10 |
        | 1      | 10.01.10 |
        | 2      | 05.01.10 |
        | 2      | 10.01.10 |
        | 2      | 20.01.10 |

    Now I want to GROUP BY ItemID, and for the date I want to get the value which exceeds 60%. What I mean is that item 1 has five rows, so each accounts for 20%, and item 2 has three rows, so each accounts for 33.33%. So for item 1 I need the 4th value and for item 2 the 2nd value, so that the result looks like this:

        | ItemID | Date     |
        ---------------------
        | 1      | 06.01.10 |
        | 2      | 10.01.10 |

    Is there an easy way to get this data? Maybe using OVER? Thank you, Torben
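    Reading "exceeds 60%" as "the first row whose cumulative share of the item's rows is strictly above 60%" (which matches the sample output: row 4 of 5 for item 1, row 2 of 3 for item 2), a hedged T-SQL sketch using window functions (SQL Server 2005+; the table name Items is made up):

        WITH numbered AS (
            SELECT ItemID, [Date],
                   ROW_NUMBER() OVER (PARTITION BY ItemID ORDER BY [Date]) AS rn,
                   COUNT(*)     OVER (PARTITION BY ItemID)                 AS cnt
            FROM Items
        )
        SELECT ItemID, [Date]
        FROM numbered
        WHERE rn = FLOOR(cnt * 0.6) + 1;   -- first row strictly past the 60% mark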

    Read the article

  • same flash file (.swf) downloaded multiple times on a page

    - by Gunjan
    I have a page with a table in which each row corresponds to an audio file. The last cell in each row embeds a simple Flash audio player. The problem is that the Flash file for the player is being downloaded separately for each row, and as soon as the rows go beyond 40-50 it crashes the browser. I tried using different players (1pixelout, flash-mp3-player) and the problem is still there, so it's not a player-specific issue. Is there any way to cache the player so that it is only downloaded once?

    Read the article

  • jQuery - OnClick, change background color for table cells

    - by andrew
    Hi all, let me show you a demo: here. It works only for rows; it's not working for cells. I want to change cells' (tds') background colors with mouse clicks. For example, I have a table with 4 tds laid out as

        A - B
        C - D

    The table's background color is white. If I click td A, A should turn red; then if I click B, the B td should turn red and the A td should become white again. If I click C, then C should turn red and B should become white. Can anyone help me?

    Read the article

  • Trying to get JQuery Autocomplete working on Asp.Net page.

    - by JasonMHirst
    Can someone shed some light on the problem please? I have the following on my ASP.NET page:

        $(document).ready(function () {
            $("#txtFirstContact").autocomplete({ url: 'http://localhost:7970/Home/FindSurname' });
        });

    The HTTP request hits a function on an MVC controller, and that code is here:

        Function FindSurname(ByVal surname As String, ByVal count As Integer)
            Dim sqlConnection As New SqlClient.SqlConnection
            sqlConnection.ConnectionString = My.Settings.sqlConnection
            Dim sqlCommand As New SqlClient.SqlCommand
            sqlCommand.CommandText = "SELECT ConSName FROM tblContact WHERE ConSName LIKE '" & surname & "%'"
            sqlCommand.Connection = sqlConnection
            Dim ds As New DataSet
            Dim da As New SqlClient.SqlDataAdapter(sqlCommand)
            da.Fill(ds, "Contact")
            sqlConnection.Close()
            Dim contactsArray As New List(Of String)
            For Each dr As DataRow In ds.Tables("Contact").Rows
                contactsArray.Add(dr.Item("ConSName"))
            Next
            Return Json(contactsArray, JsonRequestBehavior.AllowGet)
        End Function

    As far as I'm aware, the controller is returning JSON data; however, I don't know if the function parameters are correct, or indeed if the returned format is interpretable by the autocomplete plugin. If anyone can assist in the matter I'd really appreciate it.

    Read the article

  • In python writing from XML to CSV, encoding error

    - by user574435
    Hi, I am trying to convert an XML file to CSV, but the XML's encoding ("ISO-8859-1") apparently contains characters that are not in the ascii codec Python uses to write the rows. I get the error:

        Traceback (most recent call last):
          File "convert_folder_to_csv_PLAYER.py", line 139, in <module>
            xml2csv_PLAYER(filename)
          File "convert_folder_to_csv_PLAYER.py", line 121, in xml2csv_PLAYER
            fout.writerow(row)
        UnicodeEncodeError: 'ascii' codec can't encode character u'\xe1' in position 4: ordinal not in range(128)

    I have tried opening the file as follows:

        dom1 = parse(input_filename.encode("utf-8"))

    and I have tried replacing the \xe1 character in each row before it is written. Any suggestions?

    Read the article

  • Approach for altering Primary Key from GUID to BigInt in SQL Server related tables

    - by Tom
    I have two tables with 10-20 million rows that have GUID primary keys, and at least 12 tables related via foreign keys. The base tables have 10-20 indexes each. We are moving from GUID to BigInt primary keys, and I'm wondering if anyone has any suggestions on an approach. Right now this is the approach I'm pondering:
    1. Drop all indexes and foreign keys on all the tables involved.
    2. Add a 'NewPrimaryKey' column to each table.
    3. Make the new key an identity on the two base tables.
    4. Script the data change: "update table x set NewPrimaryKey = y where OldPrimaryKey = z".
    5. Rename the original primary key to 'OldPrimaryKey'.
    6. Rename the 'NewPrimaryKey' column to 'PrimaryKey'.
    7. Script back all the indexes and foreign keys.
    Does this seem like a good approach? Does anyone know of a tool or script that would help with this? TD: Edited per additional information. See this blog post that addresses an approach when the GUID is the primary key: http://www.sqlmag.com/blogs/sql-server-questions-answered/sql-server-questions-answered/tabid/1977/entryid/12749/Default.aspx
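    A hedged sketch of what steps 2-4 in the list above look like for one base/child table pair in T-SQL; all table, column and schema names here are made up, and a real script would repeat the UPDATE for each of the 12+ related tables:

        -- Steps 2-3: add the new key as an identity on the base table
        ALTER TABLE dbo.BaseTable  ADD NewPrimaryKey BIGINT IDENTITY(1,1) NOT NULL;
        ALTER TABLE dbo.ChildTable ADD NewParentKey  BIGINT NULL;

        -- Step 4: translate each child's GUID foreign key into the new BigInt value
        UPDATE c
        SET    c.NewParentKey = b.NewPrimaryKey
        FROM   dbo.ChildTable AS c
        JOIN   dbo.BaseTable  AS b
          ON   b.PrimaryKey = c.ParentGuid;   -- PrimaryKey is still the old GUID at this point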

    Read the article

  • How to make an EditText work correctly in a ListView?

    - by TianDong
    Hello all, I have a ListView which contains an EditText in each of its rows. I also have an array; the length of the array equals the number of rows in the ListView. I want to store the user input (the text in the EditText) in the array. E.g., if I type some text into the EditText in the first row of the ListView, I want the text to be stored in Array[0]. But how can I detect which row the EditText belongs to? I can detect the position of the row if the row contains a RadioGroup, but not an EditText. And what if I first type some text into the EditText and some time later I want to update my input? How can I update it? Thanks a lot!

    Read the article

  • jQuery .closest returns undefined

    - by Andy Holmes
    I've got the code below which works fine; however, the jQuery to add the items doesn't find the data-parent-room value and just returns undefined. This is the only thing not working :(

    HTML:

        <div id="inventoryRooms">
          <!--BOX SHART-->
          <div class="widget box formHolder" data-parent-room="1">
            <!--ROOM NAME-->
            <form class="widget-header rooms">
              <input type="text" placeholder="Type Room name" name="roomName[]" class="form-input add-room-input input-width-xxlarge">
              <input type="hidden" class="roomId" name="roomId[]">
              <input type="hidden" class="inventoryId" name="inventoryId[]" value="<?=$_GET['inventory_id']?>">
              <div class="toolbar no-padding">
                <div class="btn-group">
                  <span class="btn saveRoom"><i class="icon-ok"></i> Save Room</span>
                </div>
              </div>
            </form>
            <!--/END-->
            <!--GENERIC ROW TITLES-->
            <div class="widget-header header-margin hide">
              <div class="row row-title">
                <div class="col-md-3"><h5>ITEM</h5></div>
                <div class="col-md-3"><h5>DESCRIPTION</h5></div>
                <div class="col-md-3"><h5>CONDITION</h5></div>
                <div class="col-md-2"><h5>PHOTOGRAPH</h5></div>
                <div class="col-md-1 align-center"><h5><i class="icon-cog"> </i></h5></div>
              </div>
            </div>
            <!--/END-->
            <!--ADD ITEM-->
            <div class="items">
            </div>
            <!--/END-->
            <div class="toolbar-small">
              <div class="btn-group">
                <span class="btn addItem"><i class="icon-plus"></i> Add Item</span>
                <span data-toggle="dropdown" class="btn dropdown-toggle"><i class="icon-gear"></i> Options<span class="button-space"></span><i class="icon-angle-down"></i></span>
                <ul class="dropdown-menu pull-right">
                  <li><a href="#"><i class="icon-trash"></i> Delete Room</a></li>
                </ul>
              </div>
            </div>
          </div>
        </div>

    jQuery:

        $(document).on('click','.addItem', function(){
            $('<!--ROW START-->\
            <form class="widget-content item">\
            <div class="row">\
            <div class="col-md-3"><input type="text" class="form-control" name="itemName[]"></div>\
            <div class="col-md-3"><textarea class="auto form-control" name="itemDescription[]" cols="20" rows="1" style="word-wrap: break-word; resize: vertical;"></textarea></div>\
            <div class="col-md-3"><textarea class="auto form-control" name="itemCondition[]" cols="20" rows="1" style="word-wrap: break-word; resize: vertical;"></textarea></div>\
            <input type="hidden" class="itemId" name="itemId[]" value="">\
            <input type="hidden" name="itemInventoryId[]" value="<?=$_GET["inventory_id"]?>">\
            <input type="hidden" name="itemParent[]" value="'+$(this).closest().attr('data-parent-room')+'">\
            <div class="col-md-2">\
            <div class="fileinput-holder input-group">\
            <input id="fileupload" type="file" name="files[]" data-url="uploads/">\
            </div>\
            </div>\
            <div class="col-md-1 align-center"><i class="save icon-ok large"> </i>&nbsp;&nbsp;&nbsp;<i class="delete icon-trash large"> </i></div>\
            </div>\
            </form>\
            <!--/ROW END-->').fadeIn(500).appendTo($(this).parents().siblings('.items'));
            $(this).parent().parent().siblings('.widget-header, .header-margin, .hide').removeClass('hide').fadeIn();
        });

    Like I say, it all works fine apart from that damn data-parent-room value. Any help is appreciated! Using jQuery 1.10.1.

    Read the article

  • Efficient persistent storage for simple id to table of values map for java

    - by wds
    I need to store some data that follows the simple pattern of mapping an "id" to a full table (with multiple rows) of several columns (i.e. some integer values [u, v, w]). The size of one of these tables would be a couple of KB. Basically, what I need is a persistent cache of some intermediary results. This could quite easily be implemented with simple SQL, but there are a couple of problems: namely, I need to compress the size of this structure on disk as much as possible (because of the amount of values I'm storing). Also, it's not transactional - I just need to write once and simply read the contents of the entire table - so a relational DB isn't actually a very good fit. I was wondering if anyone had any good suggestions? For some reason I can't seem to come up with something decent at the moment. Something with a Java API would be especially nice.

    Read the article

  • SQL statement HAVING MAX(some+thing)=some+thing

    - by Andreas
    I'm having trouble with Microsoft Access 2003; it's complaining about this statement:

        select cardnr from change
        where year(date)<2009
        group by cardnr
        having max(time+date) = (time+date) and cardto='VIP'

    What I want to do is, for every distinct cardnr in the table change, find the row with the latest (time+date) that is before the year 2009, and then just select the rows with cardto='VIP'. This validator says it's OK; Access says it's not OK. This is the message I get: "You tried to execute a query that does not include the specified expression 'max(time+date)=time+date and cardto='VIP' and cardnr=' as part of an aggregate function." Could someone please explain what I'm doing wrong and the right way to do it? Thanks
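    The HAVING clause can only reference grouped columns and aggregates, which is what Access is objecting to. A hedged sketch of one way to express "latest pre-2009 row per card, VIP only" with a correlated subquery instead of GROUP BY (Access/Jet syntax; date and time are assumed to be Date/Time columns and are bracketed because both clash with built-in names):

        SELECT c.cardnr
        FROM   [change] AS c
        WHERE  c.cardto = 'VIP'
          AND  Year(c.[date]) < 2009
          AND  c.[date] + c.[time] = (
                 SELECT MAX(c2.[date] + c2.[time])
                 FROM   [change] AS c2
                 WHERE  c2.cardnr = c.cardnr
                   AND  Year(c2.[date]) < 2009
               );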

    Read the article
