Search Results

Search found 4815 results on 193 pages for 'parameterized queries'.


  • Optimal preferences for prefix queries with Oracle catalog (CTXCAT) index

    - by nw
    The documentation for Oracle Text gives this example of a prefix/substring preference setting for context and catalog indexes:

      begin
        ctx_ddl.create_preference('mywordlist', 'BASIC_WORDLIST');
        ctx_ddl.set_attribute('mywordlist','PREFIX_INDEX','TRUE');
        ctx_ddl.set_attribute('mywordlist','PREFIX_MIN_LENGTH', '3');
        ctx_ddl.set_attribute('mywordlist','PREFIX_MAX_LENGTH', '4');
        ctx_ddl.set_attribute('mywordlist','SUBSTRING_INDEX', 'YES');
      end;

    What I need to know is whether the substring_index attribute is necessary if I only ever issue prefix searches, such as:

      SELECT title FROM auction WHERE CATSEARCH(title, 'cam*', '') > 0;

      TITLE
      ---------------
      CANON CAMERA
      FUJI CAMERA
      NIKON CAMERA
      OLYMPUS CAMERA
      PENTAX CAMERA
      SONY CAMERA

      6 rows selected

    Read the article

  • nested linq-to-sql queries

    - by ile
      var result = (from contact in db.Contacts
                    join user in db.Users on contact.CreatedByUserID equals user.UserID
                    orderby contact.ContactID descending
                    select new ContactListView
                    {
                        ContactID = contact.ContactID,
                        FirstName = contact.FirstName,
                        LastName = contact.LastName,
                        Company = (from field in contact.XmlFields.Descendants("Company")
                                   select field.Value).SingleOrDefault().ToString()
                    }).Take(10);

    Here I described how my database tables look. The Contacts table has one field that is of xml type; the Company value is stored in that field and I need to read it. I tried it this way:

      Company = (from field in contact.XmlFields.Descendants("Company")
                 select field.Value).SingleOrDefault().ToString()

    but I get the following error:

      Member access 'System.String Value' of 'System.Xml.Linq.XElement' not legal on type 'System.Collections.Generic.IEnumerable`1[System.Xml.Linq.XElement].

    Any solution for this? Thanks in advance, Ile

    Read the article

  • Queries with Multiple Constraints

    - by ANITHA
    I have the following tables and fields:

      Request:      Requester_Name, Request_No
      RequestItem:  Request_No, Item
      Item:         Item

    I would like to filter the items which are selected under a particular request number, along with a specific requester name. How might I go about doing this?
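    A minimal sketch of one way to do this (assuming Request_No links Request to RequestItem, and that @request_no / @requester_name are bound as parameters):

      SELECT ri.Item
      FROM   Request r
             INNER JOIN RequestItem ri ON ri.Request_No = r.Request_No
      WHERE  r.Request_No = @request_no
        AND  r.Requester_Name = @requester_name;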

    Read the article

  • ForceContext Queries method twice?

    - by azz0r
    Hello, I'm writing a per-minute script to output XML that a server will read, and I'm using forceContext.

      <?php
      class My_Controller_Action_Helper_ForceContext extends Zend_Controller_Action_Helper_ContextSwitch
      {
          public function initContext($format = null)
          {
              $request = $this->getRequest();
              $action  = $request->getActionName();
              $context = $this->getActionContexts($action);
              // check if this is the only context
              if (count($context) === 1) {
                  $format = $context[0];
              }
              return parent::initContext($format);
          }
      }

      class Video_PerMinuteController extends Zend_Controller_Action
      {
          function init()
          {
              $contextSwitch = $this->_helper->getHelper('ForceContext');
              $contextSwitch->addActionContext('transaction', 'xml')->initContext();

    In my method, it gets the current minute count, adds 1, then saves, so I can clearly see when it's accessed more than once in a minute. If I comment out the second contextSwitch line, it only goes up by 1; if it's not commented out, it displays the xml page but adds 2 minutes (being called twice somehow). Any ideas?

    Read the article

  • Summing the results of Case queries in SQL

    - by David Stelfox
    I think this is a relatively straightforward question but I have spent the afternoon looking for an answer and cannot yet find it. So... I have a view with a country column and a number column. I want to make any number less than 10 'Other' and then sum the 'Other's into one value. For example:

      AR 10
      AT 7
      AU 11
      BB 2
      BE 23
      BY 1
      CL 2

    I used CASE as follows:

      select country = case when number < 10 then 'Other' else country end,
             number
      from ...

    This replaces the country values whose number column is less than 10 with 'Other', but I can't work out how to sum them. I want to end up with a table/view which looks like this:

      AR 10
      AU 11
      BE 23
      Other 12

    Any help is greatly appreciated. Cheers, David
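    A minimal sketch of one way to aggregate that (my_view stands in for the actual view name; the column names are taken from the question):

      SELECT CASE WHEN number < 10 THEN 'Other' ELSE country END AS country,
             SUM(number) AS number
      FROM   my_view
      GROUP BY CASE WHEN number < 10 THEN 'Other' ELSE country END;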

    Read the article

  • SQL queries searching by date range

    - by tecno
    Hi, I have a table in an Access 2007 database; all fields are of type text. Can the following be done using the WHERE clause? If so, how?

      SELECT * from Table1 WHERE (ColumnDate is between 26th and 19th of March 2010)
      SELECT * from Table1 WHERE (ColumnAge is between 25 and 40)

    The usual < and <= operators don't seem to work. Thanks,
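    A sketch of one possible approach, assuming the text in ColumnDate parses as a date and ColumnAge as a whole number: because the fields are stored as text, they have to be converted before a range comparison behaves as expected (CDate/CLng and #..# date literals are standard in Access SQL).

      SELECT * FROM Table1
      WHERE  CDate(ColumnDate) BETWEEN #03/19/2010# AND #03/26/2010#;

      SELECT * FROM Table1
      WHERE  CLng(ColumnAge) BETWEEN 25 AND 40;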

    Read the article

  • Embedding generic SQL queries into a C# program

    - by Pooja Balkundi
    Okay, referring to my first question: in Main, I want the user to enter an employee name at runtime. I then take the name the user entered and compare it with e_name in my emp table; if it exists, I want to display all information for that employee. How can I achieve this?

      using System;
      using System.Collections.Generic;
      using System.Linq;
      using System.Windows.Forms;
      using MySql.Data.MySqlClient;

      namespace ConnectCsharppToMySQL
      {
          public class DBConnect
          {
              private MySqlConnection connection;
              private string server;
              private string database;
              private string uid;
              private string password;
              string name;

              //Constructor
              public DBConnect()
              {
                  Initialize();
              }

              //Initialize values
              private void Initialize()
              {
                  server = "localhost";
                  database = "test";
                  uid = "root";
                  password = "";
                  string connectionString;
                  connectionString = "SERVER=" + server + ";" + "DATABASE=" + database + ";"
                                   + "UID=" + uid + ";" + "PASSWORD=" + password + ";";
                  connection = new MySqlConnection(connectionString);
              }

              //open connection to database
              private bool OpenConnection()
              {
                  try
                  {
                      connection.Open();
                      return true;
                  }
                  catch (MySqlException ex)
                  {
                      //When handling errors, you can base your application's response
                      //on the error number.
                      //The two most common error numbers when connecting are as follows:
                      //0: Cannot connect to server.
                      //1045: Invalid user name and/or password.
                      switch (ex.Number)
                      {
                          case 0:
                              MessageBox.Show("Cannot connect to server. Contact administrator");
                              break;
                          case 1045:
                              MessageBox.Show("Invalid username/password, please try again");
                              break;
                      }
                      return false;
                  }
              }

              //Close connection
              private bool CloseConnection()
              {
                  try
                  {
                      connection.Close();
                      return true;
                  }
                  catch (MySqlException ex)
                  {
                      MessageBox.Show(ex.Message);
                      return false;
                  }
              }

              //Insert statement
              public void Insert()
              {
                  string query = "INSERT INTO emp (e_name, age) VALUES('Pooja R', '21')";
                  //open connection
                  if (this.OpenConnection() == true)
                  {
                      //create command and assign the query and connection from the constructor
                      MySqlCommand cmd = new MySqlCommand(query, connection);
                      //Execute command
                      cmd.ExecuteNonQuery();
                      //close connection
                      this.CloseConnection();
                  }
              }

              //Update statement
              public void Update()
              {
                  string query = "UPDATE emp SET e_name='Peachy', age='22' WHERE e_name='Pooja R'";
                  //Open connection
                  if (this.OpenConnection() == true)
                  {
                      //create mysql command
                      MySqlCommand cmd = new MySqlCommand();
                      //Assign the query using CommandText
                      cmd.CommandText = query;
                      //Assign the connection using Connection
                      cmd.Connection = connection;
                      //Execute query
                      cmd.ExecuteNonQuery();
                      //close connection
                      this.CloseConnection();
                  }
              }

              //Select statement
              public List<string>[] Select()
              {
                  string query = "SELECT * FROM emp where e_name=(/*I WANT USER ENTERED NAME TO GET INSERTED HERE*/)";
                  //Create a list to store the result
                  List<string>[] list = new List<string>[3];
                  list[0] = new List<string>();
                  list[1] = new List<string>();
                  list[2] = new List<string>();
                  //Open connection
                  if (this.OpenConnection() == true)
                  {
                      //Create Command
                      MySqlCommand cmd = new MySqlCommand(query, connection);
                      //Create a data reader and Execute the command
                      MySqlDataReader dataReader = cmd.ExecuteReader();
                      //Read the data and store them in the list
                      while (dataReader.Read())
                      {
                          list[0].Add(dataReader["e_id"] + "");
                          list[1].Add(dataReader["e_name"] + "");
                          list[2].Add(dataReader["age"] + "");
                      }
                      //close Data Reader
                      dataReader.Close();
                      //close Connection
                      this.CloseConnection();
                      //return list to be displayed
                      return list;
                  }
                  else
                  {
                      return list;
                  }
              }

              public static void Main(String[] args)
              {
                  DBConnect db1 = new DBConnect();
                  Console.WriteLine("Initializing");
                  db1.Initialize();
                  Console.WriteLine("Search :");
                  Console.WriteLine("Enter the employee name");
                  db1.name = Console.ReadLine();
                  db1.Select();
                  Console.ReadLine();
              }
          }
      }
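    A minimal sketch of the parameterized form of that SELECT (the @e_name placeholder is an assumption; in the C# code it would be bound on the MySqlCommand with something like cmd.Parameters.AddWithValue("@e_name", name) rather than concatenated into the query string):

      SELECT e_id, e_name, age
      FROM   emp
      WHERE  e_name = @e_name;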

    Read the article

  • Stop 2 identical queries from executing almost simultaneously?

    - by James Simpson
    I have developed an AJAX-based game where a bug occurs (very rarely, but in volume it happens at least once per hour) when, for some reason, two requests get sent to the processing page almost simultaneously (for the last one I tracked, the requests were 0.0001 ms apart). There is a check right before the query is executed to make sure it doesn't get executed twice, but since the difference is so small, the check hasn't finished before the next query gets executed. I'm stumped; how can I prevent this? It is causing serious problems in the game. Just to be clearer, the query starts a new round in the game, so when it executes twice it starts 2 rounds at the same time, which breaks the game. I need to be able to stop the script from executing if the previous round isn't over, even if that previous round started 0.0001 ms ago.
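    One way to close that window, sketched in SQL under the assumption of a MySQL backend and a hypothetical game_rounds table holding the current round state: make starting a round a single conditional UPDATE, let the database serialize the two requests, and only continue in the request that actually changed a row.

      UPDATE game_rounds
      SET    round_no   = round_no + 1,
             started_at = NOW()
      WHERE  game_id = @game_id
        AND  started_at <= NOW() - INTERVAL 60 SECOND;

      -- affected rows = 1: this request starts the new round
      -- affected rows = 0: another request beat it by a fraction of a
      --                    millisecond (or the round is still running); abort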

    Read the article

  • Access: strange results with queries against MDB file

    - by Craig Johnston
    I am running the following SQL against an MDB file, a copy of which is located here: http://hotfile.com/dl/40641614/2353dfc/test.mdb.html (perfectly clean file, no macros or viruses)

      SELECT datediff("d", MAX(invoice.date), Now) AS Date_Diff,
             MAX(invoice.date) AS max_invoice_date,
             customer.number AS customer_number
      FROM   invoice
             INNER JOIN customer ON invoice.customer_number = customer.number
      GROUP BY customer.number

    If the following were added:

      HAVING datediff("d", MAX(invoice.date), Now) > 365

    would this simply exclude rows with Date_Diff <= 365? What should be the effect of the HAVING clause here?
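    For reference, HAVING is applied to each group after aggregation, so a sketch of the combined query (same tables and columns as above) would keep only the customers whose most recent invoice is more than 365 days old:

      SELECT datediff("d", MAX(invoice.date), Now) AS Date_Diff,
             MAX(invoice.date) AS max_invoice_date,
             customer.number AS customer_number
      FROM   invoice
             INNER JOIN customer ON invoice.customer_number = customer.number
      GROUP BY customer.number
      HAVING datediff("d", MAX(invoice.date), Now) > 365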

    Read the article

  • Strange behavior with complex Q object filter queries in Django

    - by HWM-Rocker
    Hi, I am trying to write a tagging system for Django, but today I encountered strange behavior in filter or the Q object (django.db.models.Q). I wrote a function that converts a search string into a Q object. The next step would be to filter the TaggedObject with this query. But unfortunately I get strange behavior.

    When I search for (id=20), the Q object is (AND: ('tags__tag__id', 20)) and it returns 2 tagged objects, with the IDs 1127 and 132. When I search for (id=4), the Q object is (AND: ('tags__tag__id', 4)) and it also returns 2 objects, but this time 1180 and 1127.

    Up to here everything is fine, but when I make a slightly more complex query like (id=4) or (id=20), i.e. (OR: ('tags__tag__id', 4), ('tags__tag__id', 20)), it returns 4(!) objects: 1180, 1127, 1127, 132. The object with the ID 1127 is returned twice, but that's not the behavior I want. Do I have to live with it and uniquify that list, or can I do something different? The representation of the Q object looks fine to me.

    But the worst is when I search for (id=20) and (id=4), i.e. (AND: ('tags__tag__id', 20), ('tags__tag__id', 4)): then it returns no objects at all. But why? The representation should be OK, and the object with the id 1127 is tagged by both. What am I missing?

    Here are also the relevant parts of the classes that are involved:

      class TaggedObject(models.Model):
          """ class that represent a tagged object """
          tags = generic.GenericRelation('ObjectTagBridge', blank=True, null=True)

      class ObjectTagBridge(models.Model):
          """ Help to connect a generic object to a Tag. """
          # pylint: disable-msg=W0232,R0903
          content_type = models.ForeignKey(ContentType)
          object_id = models.PositiveIntegerField()
          content_object = generic.GenericForeignKey('content_type', 'object_id')
          tag = models.ForeignKey('Tag')

      class Tag(models.Model):
          ...

    Thanks for your help
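    A sketch of roughly what the ORM generates may explain both results (taggedobject and objecttagbridge here are illustrative names, not the real generated table names). The OR query joins each object to its tag rows once and matches a row with either tag, so an object carrying both tags comes back twice unless the result is de-duplicated; the AND query asks a single joined row to carry both tag ids at once, which is never true, so a second join against the bridge table is needed.

      -- OR: duplicates removed with DISTINCT
      SELECT DISTINCT o.*
      FROM   taggedobject o
             JOIN objecttagbridge b ON b.object_id = o.id
      WHERE  b.tag_id IN (4, 20);

      -- AND: one join per required tag
      SELECT o.*
      FROM   taggedobject o
             JOIN objecttagbridge b1 ON b1.object_id = o.id AND b1.tag_id = 4
             JOIN objecttagbridge b2 ON b2.object_id = o.id AND b2.tag_id = 20;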

    Read the article

  • Organizing Eager Queries in an ObjectContext

    - by Nix
    I am messing around with Entity Framework 3.5 SP1 and I am trying to find a cleaner way to do the below. I have an EF model and I am adding some eagerly loaded entities, and I want them all to reside in the "Eager" property on the context. We originally were just changing the entity set name, but it seems a lot cleaner to just use a property and keep the entity set name intact. Example:

      Context
        - EntityType
        - AnotherType
        - Eager (all of these would have .Includes to pull in all assoc. tables)
            - EntityType
            - AnotherType

    Currently I am using composition but I feel like there is an easier way to do what I want.

      namespace Entities
      {
          public partial class TestObjectContext
          {
              EagerExtensions Eager { get; set; }

              public TestObjectContext()
              {
                  Eager = new EagerExtensions(this);
              }
          }

          public partial class EagerExtensions
          {
              TestObjectContext context;

              public EagerExtensions(TestObjectContext _context)
              {
                  context = _context;
              }

              public IQueryable<TestEntity> TestEntity
              {
                  get
                  {
                      return context.TestEntity
                          .Include("TestEntityType")
                          .Include("Test.Attached.AttachedType")
                          .AsQueryable();
                  }
              }
          }
      }

      public class Tester
      {
          public void ShowHowIWantIt()
          {
              TestObjectContext context = new TestObjectContext();
              var query = from a in context.Eager.TestEntity
                          select a;
          }
      }

    Read the article

  • MySQL - One large query vs Ajax individual queries

    - by Mark
    Hi guys, I guess no one will have a definitive answer to this, but considered predictions would be appreciated. I am in the process of developing a MySQL database for a web application, and my question is: is it more efficient to make a single query that returns a single row using AJAX, or to request 100-700 rows when the user will likely only ever use the results of two or three? Really I am asking what is heavier for the server: 2-3 requests with one result each, or 1 request with 100-700 results? Thanks, Mark

    Read the article

  • Large number of UPDATE queries slowing down page

    - by Bryan Lewis
    I am reading and validating large fixed-width text files (ranging from 10-50K lines) that are submitted via our ASP.net website (coded in VB.Net). I do an initial scan of the file to check for basic issues (line length, etc). Then I import each row into a MS SQL table. Each DB row basically consists of a record_ID (primary, auto-incrementing) and about 50 varchar fields.

    After the insert is done, I run a validation function on the file that checks each field in each row based on a bunch of criteria (trimmed length, isnumeric, range checks, etc). If it finds an error in any field, it inserts a record into the Errors table, which has an error_ID, the record_ID and an error message. In addition, if the field fails in a particular way, I have to do a "reset" on that field. A reset might consist of blanking the entire field, or simply replacing the value with another value (e.g. replacing the string with a new one that has all illegal chars taken out).

    I have a 5,000 line test file. The upload, initial check, and import takes about 5-6 seconds. The detailed error check and insert into the Errors table takes about 5-8 seconds (this file has about 1200 errors in it). However, the "resets" part takes about 40-45 seconds for 750 fields that need to be reset. When I comment out the resets function (returning immediately without actually calling the UPDATE stored proc), the process is very fast. With the resets turned on, the pages take 50 seconds to return.

    My UPDATE stored proc is using some recommended code from http://sommarskog.se/dynamic_sql.html, whereby it uses CASE instead of dynamic SQL:

      UPDATE dbo.Records
      SET    dbo.Records.file_ID = CASE @field_name
                                       WHEN 'file_ID' THEN @field_value
                                       ELSE file_ID
                                   END,
             .
             . (all 50 varchar field CASE statements here)
             .
      WHERE  dbo.Records.record_ID = @record_ID

    Is there any way I can improve my performance here? Can I somehow group all of these UPDATE calls into a single transaction? Should I be reworking the UPDATE query somehow? Or is it just the sheer quantity of 750+ UPDATEs and things are just slow (it's a quad proc server with 8GB ram)? Any suggestions appreciated.
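    One thing that may be worth trying, sketched here only at the T-SQL level (whether it helps depends on how the 750 calls are issued from the VB.Net code): run the whole batch of reset calls inside one explicit transaction so the log is flushed once per batch instead of once per UPDATE.

      BEGIN TRANSACTION;
      -- execute all of the reset stored-procedure calls shown above here,
      -- on the same connection and transaction
      COMMIT TRANSACTION;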

    Read the article

  • LinqToSql Select to a class then do more queries

    - by fyjham
    I have a LINQ query running with multiple joins and I want to pass it around as an IQueryable<T> and apply additional filters in other methods. The problem is that I can't work out how to pass around a var data type and keep it strongly typed, and if I try to put it in my own class (e.g. .Select((a,b) => new MyClass(a,b))) I get errors when I try to add later Where clauses, because my class has no translation into SQL. Is there any way I can do one of the following?

      1. Make my class map to SQL?
      2. Make the var data type implement an interface (so I can pass it around as though it's that)?
      3. Something I haven't thought of that'll solve my issue?

    Example:

      public void Main()
      {
          using (DBDataContext context = new DBDataContext())
          {
              var result = context.TableAs.Join(
                  context.TableBs,
                  a => a.BID,
                  b => b.ID,
                  (a, b) => new { A = a, B = b });

              result = addNeedValue(result, 4);
          }
      }

      private ???? addNeedValue(???? result, int value)
      {
          return result.Where(r => r.A.Value == value);
      }

    PS: I know in my example I can flatten out the function easily, but in the real thing it'd be an absolute mess if I tried.

    Read the article

  • Run SQL Queries on DataTables, or similar, in .Net, without an RDBMS

    - by FastAl
    I'd like to have a dataset or datatables and be able to run SQL statements on them, without using any external RDBMS. For example, to take 2 datatables in a dataset and just join them outright with a SQL statement and WHERE clause, the result being a new datatable. For example, if I have 2 datatables, named People and Addresses, in a dataset (that I built using code, not getting from a database; pardon the old-fashioned join syntax):

      Dim dtJoined As DataTable = MyDataSet.RunSQLQuery("Select * from People, Orders Where People.PersonID=Orders.OrdereID")

    Thanks

    Read the article

  • Database time data retrieval, time based queries

    - by Raphael Pineda
    I am new to time manipulation and time arithmetic operations, and am currently developing a navigation system with web-server-based information. I have a database containing a peek hours table whose columns are:

      id, start_time, end_time, edge_id, day_of_the_week, edge_weight

    I am using PHP as a web service, and based on the current time I want to get all the records that fit this condition: start_time < current_time < end_time
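    A minimal sketch of that lookup (peek_hours as the table name is an assumption, as is the use of TIME-typed columns; @current_time and @current_day would be bound as parameters from the PHP layer):

      SELECT *
      FROM   peek_hours
      WHERE  @current_time > start_time
        AND  @current_time < end_time;
      -- optionally also: AND day_of_the_week = @current_day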

    Read the article

  • How are Reads Distributed in a Workload

    - by Bill Graziano
    People have uploaded nearly one million rows of trace data to TraceTune. That's enough data to start to look at the results in aggregate. The first thing I want to look at is logical reads. This is the easiest metric to identify and fix.

    When you upload a trace, I rank each statement based on the total number of logical reads. I also calculate each statement's percentage of the total logical reads. I do the same thing for CPU, duration and logical writes. When you view a statement you can see all the details like this:

    This single statement consumed 61.4% of the total logical reads on the system while we were tracing it. I also wanted to see the distribution of reads across statements. That graph looks like this:

    On average, the highest ranked statement consumed just under 50% of the reads on the system. When I tune a system, I'm usually starting in one of two modes: this "piece" is slow, or the whole system is slow. If a given piece (screen, report, query, etc.) is slow you can usually find the specific statements behind it and tune it. You can make that individual piece faster but you may not affect the whole system.

    When you're trying to speed up an entire server you need to identify those queries that are using the most disk resources in aggregate. Fixing those will make them faster and it will leave more disk throughput for the rest of the queries. Here are some of the things I've learned querying this data:

      - The highest ranked query averages just under 50% of the total reads on the system.
      - The top 3 ranked queries average 73% of the total reads on the system.
      - The top 10 ranked queries average 91% of the total reads on the system.

    Remember these are averages across all the traces that have been uploaded. And I'm guessing that people mainly upload traces where there are performance problems, so your mileage may vary.

    I also learned that slow queries aren't the problem. Before I wrote ClearTrace I used to identify queries by filtering on high logical reads using Profiler. That picked out individual queries, but those rarely ran often enough to put a large load on the system. If you look at the execution count by rank you'd see that the highest ranked queries also have the highest execution counts. The graph would look very similar to the one above but flatter. These queries don't look that bad individually but run so often that they hog the disk capacity.

    The takeaway from all this is that you really should be tuning the top 10 queries if you want to make your system faster. Tuning individually slow queries will help those specific queries but won't have much impact on the system as a whole.

    Read the article

  • Tweak Data in Doctrine Migration (say, by running some arbitrary queries)

    - by timdev
    I've got a migration that creates a couple of columns that act as foreign keys. In particular, I'm adding a creator_id and owner_id column to a model. These foreign keys indicate who created, and who currently owns, a particular kind of domain object. Users are managed by sfDoctrineGuardPlugin. What I'd like to do, for the purpose of the migration, is look up the (active) user with the lowest id, and default creator_id/owner_id to that. I don't see any particularly obvious/proper way to run arbitrary operations on the database during a migration. Does anyone know how?
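    If the migration is able to execute raw SQL at some point during its run, the data tweak itself is a single UPDATE against the new columns. A sketch, where my_domain_object is a stand-in for the actual table and sf_guard_user / is_active are assumed from sfDoctrineGuardPlugin's defaults:

      UPDATE my_domain_object
      SET    creator_id = (SELECT MIN(id) FROM sf_guard_user WHERE is_active = 1),
             owner_id   = (SELECT MIN(id) FROM sf_guard_user WHERE is_active = 1)
      WHERE  creator_id IS NULL;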

    Read the article

  • My Lucene queries only ever find one hit

    - by Bob
    I'm getting started with Lucene.Net (stuck on version 2.3.1). I add sample documents with this:

      Dim indexWriter = New IndexWriter(indexDir, New Standard.StandardAnalyzer(), True)
      Dim doc = Document()
      doc.Add(New Field("Title", "foo", Field.Store.YES, Field.Index.TOKENIZED, Field.TermVector.NO))
      doc.Add(New Field("Date", DateTime.UtcNow.ToString, Field.Store.YES, Field.Index.TOKENIZED, Field.TermVector.NO))
      indexWriter.AddDocument(doc)
      indexWriter.Close()

    I search for documents matching "foo" with this:

      Dim searcher = New IndexSearcher(indexDir)
      Dim parser = New QueryParser("Title", New StandardAnalyzer())
      Dim Query = parser.Parse("foo")
      Dim hits = searcher.Search(Query)
      Console.WriteLine("Number of hits = " + hits.Length.ToString)

    No matter how many times I run this, I only ever get one result. Any ideas?

    Read the article

  • Appengine (python) returns empty for valid queries

    - by Grant
    I've got an app with around half a million 'records', each of which only stores three fields. I'd like to look up records by a string field with a query, but I'm running into problems. If I visit the console page, manually view a record and save it (without making changes), it shows up in a query:

      SELECT * FROM wordEntry WHERE wordStr = 'SomeString'

    If I don't do this, I get 'no results'. Does App Engine need time to update? If so, how much? (I was also having trouble batch deleting and modifying data, but I was able to break the problem up into smaller chunks.)

    Read the article

  • mysql insert data from multiple select queries

    - by daulex
    What I've got working, and what I need to improve on:

      INSERT form_data (id, data_id, email)
      SELECT fk_form_joiner_id AS data_id,
             value AS email
      FROM   wp_contactform_submit_data
      WHERE  form_key = 'your-email'

    This just gets the emails. Now this is great, but not enough, as I have a good few different values of form_key that I need to import into different columns. I'm aware that I can do it via PHP using foreach loops and updates, but this needs to be done purely in MySQL. So how do I do something like:

      insert form_data (id, data, email, name, surname, etc)
      Select [..], Select [..] ....

    Please help
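    One common pattern for this is conditional aggregation: one row per submission, with each form_key pivoted into its own column. A sketch, where the extra form_key values ('your-name', 'your-surname') are assumptions based on the naming in the question:

      INSERT INTO form_data (data_id, email, name, surname)
      SELECT fk_form_joiner_id,
             MAX(CASE WHEN form_key = 'your-email'   THEN value END) AS email,
             MAX(CASE WHEN form_key = 'your-name'    THEN value END) AS name,
             MAX(CASE WHEN form_key = 'your-surname' THEN value END) AS surname
      FROM   wp_contactform_submit_data
      GROUP BY fk_form_joiner_id;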

    Read the article

  • Generated queries contain schema and catalog name

    - by stacker
    I have the same problem as described here. In the generated SQL, Informix expects catalog:schema.table, but what's actually generated is catalog.schema.table, which leads to a syntax error. Setting:

      hibernate.default_catalog=
      hibernate.default_schema=

    had no effect. I even removed schema and catalog from the table annotation; this caused a different issue: the query looked like that ..table. The same happened when setting catalog and schema to an empty string.

    Versions:
      Seam 2.1.2
      Hibernate Annotations 3.3.1.GA.CP01
      Hibernate 3.2.4.sp1.cp08
      Hibernate EntityManager 3.3.2.GA
      JBoss 4.3 (similar to 4.2.3)

    Read the article

  • Nested SQL queries in Rails when using :has_and_belongs_to_many

    - by Godisemo
    Hello, in my application I need the next task that has not already been done by a user. I have three models: a Book that has many Tasks, and then a User that has and belongs to many Tasks. The tasks_users table contains all completed tasks, so I need to write a complex query to find the next task to perform. I have come up with two solutions in pure SQL that work, but I can't translate them to Rails; that's what I need help with.

      SELECT *
      FROM   `tasks`
      WHERE  `tasks`.`book_id` = @book_id
        AND  `tasks`.`id` NOT IN (
               SELECT `tasks_users`.`task_id`
               FROM   `tasks_users`
               WHERE  `tasks_users`.`user_id` = @user_id)
      ORDER BY `task`.`date` ASC
      LIMIT 1;

    and equally, without a nested select:

      SELECT *
      FROM   tasks
             LEFT JOIN tasks_users
               ON tasks_users.tasks_id = task.id
              AND tasks_users.user_id = @user_id
      WHERE  tasks_users.task_id IS NULL
        AND  tasks.book_id = @book_id
      LIMIT 1;

    This is what I have done in Rails with the MetaWhere plugin:

      book.tasks.joins(:users.outer).where(:users => {:id => nil})

    but I can't figure out how to get the current user there too. Thanks for any help!

    Read the article

  • SQL - Outer Join 2 queries?

    - by Stuav
    I have two queries. Query 1 gives me this result:

      Day        New_Users
      01-Jan-12  45
      02-Jan-12  36

    and so on. Query 2 gives me this result:

      Day        Retained_Users
      01-Jan-12  33
      02-Jan-12  30

    and so on. I want a new query that will join these together and read:

      Day        New_Users  Retained_Users
      01-Jan-12  45         33
      02-Jan-12  36         30

    Do I use some sort of outer join?
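    A sketch of one way to combine them, treating each existing query as a derived table (q1 and q2 are placeholders for the two queries from the question). A LEFT JOIN keeps every day from the first result; a FULL OUTER JOIN would keep days that appear in only one of them, where the database supports it.

      SELECT q1.Day,
             q1.New_Users,
             q2.Retained_Users
      FROM   ( /* query 1 */ ) AS q1
             LEFT JOIN ( /* query 2 */ ) AS q2
               ON q2.Day = q1.Day;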

    Read the article
