Search Results

Search found 4788 results on 192 pages for 'adhoc queries'.

Page 28 of 192

  • What's better: a query with a long WHERE IN condition, or many small queries?

    - by DCrystal
    Maybe it's a little dumb, but I'm just not sure which is better. If I have to check about 30k rows in the DB for existence, what should I do?
    #1 - one query: select id from table1 where name in (smth1, smth2, ... {till 30k})
    #2 - many queries: select id from table1 where name = smth1
    Performance is not the goal, but I don't want to bring MySQL down either ;) Maybe other solutions would be more suitable... Thanks.
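    A common middle ground, sketched below as an assumption rather than a tested answer, is to load the names into a temporary table once and resolve everything with a single join, so you need neither a 30k-item IN list nor 30k round trips (the table and column names are the question's placeholders):

      -- Stage the candidate names, then look them all up with one indexed join.
      CREATE TEMPORARY TABLE tmp_names (name VARCHAR(255) PRIMARY KEY);

      -- Insert in batches of a few thousand values per statement, up to the full 30k.
      INSERT INTO tmp_names (name) VALUES ('smth1'), ('smth2'), ('smth3');

      SELECT t.id, t.name
      FROM table1 t
      JOIN tmp_names n ON n.name = t.name;

      DROP TEMPORARY TABLE tmp_names;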

    Read the article

  • Can I make these two LINQ queries into one query only?

    - by Holli
    From a List of builtAgents I need all items with OptimPriority == 1 and only 5 items with OptimPriority == 0. I do this with two separate queries, but I wonder if I could do it with only one query.
    IEnumerable<Agent> priorityAgents = from pri in builtAgents where pri.OptimPriority == 1 select pri;
    IEnumerable<Agent> otherAgents = (from oth in builtAgents where oth.OptimPriority == 0 select oth).Take(5);

    Read the article

  • How to find the worst performing queries in MS SQL Server 2008?

    - by Thomas Bratt
    How to find the worst performing queries in MS SQL Server 2008? I found the following example but it does not seem to work:
    SELECT TOP 5 obj.name, max_logical_reads, max_elapsed_time
    FROM sys.dm_exec_query_stats a
    CROSS APPLY sys.dm_exec_sql_text(sql_handle) hnd
    INNER JOIN sys.sysobjects obj on hnd.objectid = obj.id
    ORDER BY max_logical_reads DESC
    Taken from: http://www.sqlservercurry.com/2010/03/top-5-costly-stored-procedures-in-sql.html
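    One possible reason the example returns nothing, assumed here rather than confirmed: objectid from sys.dm_exec_sql_text is NULL for ad hoc batches, so the INNER JOIN to sysobjects filters those rows out. A variant that skips that join and pulls the statement text directly is sketched below (the column choices are illustrative):

      -- Rank statements by total logical reads and show their text.
      SELECT TOP 5
          SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
              ((CASE qs.statement_end_offset
                    WHEN -1 THEN DATALENGTH(st.text)
                    ELSE qs.statement_end_offset
                END - qs.statement_start_offset) / 2) + 1) AS statement_text,
          qs.total_logical_reads,
          qs.total_elapsed_time,
          qs.execution_count
      FROM sys.dm_exec_query_stats qs
      CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
      ORDER BY qs.total_logical_reads DESC;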

    Read the article

  • What is the difference between these 2 XML LINQ Queries?

    - by Jon
    I have the 2 following LINQ queries and I haven't quite got my head around LINQ, so what is the difference between the 2 approaches below? Is there a circumstance where one approach is better than the other?
    ChequeDocument.Descendants("ERRORS").Where(x => (string)x.Attribute("D") == "").Count();
    (from x in ChequeDocument.Descendants("ERRORS") where (string)x.Attribute("D") == "" select x).Count();

    Read the article

  • Can MySQL reasonably perform queries on billions of rows?

    - by haxney
    I am planning on storing scans from a mass spectrometer in a MySQL database and would like to know whether storing and analyzing this amount of data is remotely feasible. I know performance varies wildly depending on the environment, but I'm looking for the rough order of magnitude: will queries take 5 days or 5 milliseconds?
    Input format
    Each input file contains a single run of the spectrometer; each run is comprised of a set of scans, and each scan has an ordered array of datapoints. There is a bit of metadata, but the majority of the file is comprised of arrays of 32- or 64-bit ints or floats.
    Host system
    |----------------+-------------------------------|
    | OS             | Windows 2008 64-bit           |
    | MySQL version  | 5.5.24 (x86_64)               |
    | CPU            | 2x Xeon E5420 (8 cores total) |
    | RAM            | 8GB                           |
    | SSD filesystem | 500 GiB                       |
    | HDD RAID       | 12 TiB                        |
    |----------------+-------------------------------|
    There are some other services running on the server using negligible processor time.
    File statistics
    |------------------+--------------|
    | number of files  | ~16,000      |
    | total size       | 1.3 TiB      |
    | min size         | 0 bytes      |
    | max size         | 12 GiB       |
    | mean             | 800 MiB      |
    | median           | 500 MiB      |
    | total datapoints | ~200 billion |
    |------------------+--------------|
    The total number of datapoints is a very rough estimate.
    Proposed schema
    I'm planning on doing things "right" (i.e. normalizing the data like crazy) and so would have a runs table, a spectra table with a foreign key to runs, and a datapoints table with a foreign key to spectra.
    The 200 billion datapoint question
    I am going to be analyzing across multiple spectra and possibly even multiple runs, resulting in queries which could touch millions of rows. Assuming I index everything properly (which is a topic for another question) and am not trying to shuffle hundreds of MiB across the network, is it remotely plausible for MySQL to handle this?
    UPDATE: additional info
    The scan data will be coming from files in the XML-based mzML format. The meat of this format is in the <binaryDataArrayList> elements where the data is stored. Each scan produces >= 2 <binaryDataArray> elements which, taken together, form a 2-dimensional (or more) array of the form [[123.456, 234.567, ...], ...]. These data are write-once, so update performance and transaction safety are not concerns.
    My naïve plan for a database schema is:
    runs table
    | column name | type        |
    |-------------+-------------|
    | id          | PRIMARY KEY |
    | start_time  | TIMESTAMP   |
    | name        | VARCHAR     |
    |-------------+-------------|
    spectra table
    | column name    | type        |
    |----------------+-------------|
    | id             | PRIMARY KEY |
    | name           | VARCHAR     |
    | index          | INT         |
    | spectrum_type  | INT         |
    | representation | INT         |
    | run_id         | FOREIGN KEY |
    |----------------+-------------|
    datapoints table
    | column name | type        |
    |-------------+-------------|
    | id          | PRIMARY KEY |
    | spectrum_id | FOREIGN KEY |
    | mz          | DOUBLE      |
    | num_counts  | DOUBLE      |
    | index       | INT         |
    |-------------+-------------|
    Is this reasonable?
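    A minimal DDL sketch of the proposed schema, written as an assumption (the engine choice and concrete column types are guesses; the table and column names come from the tables above):

      CREATE TABLE runs (
          id         BIGINT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
          start_time TIMESTAMP NULL,
          name       VARCHAR(255) NOT NULL
      ) ENGINE=InnoDB;

      CREATE TABLE spectra (
          id             BIGINT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
          name           VARCHAR(255) NOT NULL,
          `index`        INT NOT NULL,
          spectrum_type  INT NOT NULL,
          representation INT NOT NULL,
          run_id         BIGINT UNSIGNED NOT NULL,
          FOREIGN KEY (run_id) REFERENCES runs (id)
      ) ENGINE=InnoDB;

      -- At ~200 billion rows this table dominates the storage budget:
      -- the surrogate key, foreign key and any secondary index are each
      -- paid for once per datapoint.
      CREATE TABLE datapoints (
          id          BIGINT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
          spectrum_id BIGINT UNSIGNED NOT NULL,
          mz          DOUBLE NOT NULL,
          num_counts  DOUBLE NOT NULL,
          `index`     INT NOT NULL,
          FOREIGN KEY (spectrum_id) REFERENCES spectra (id)
      ) ENGINE=InnoDB;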

    Read the article

  • How to optimize queries with fully qualified names in T-SQL?

    - by tomaszs
    When I call:
    select * from Database.dbo.Table where NAME = 'cat'
    it takes 200 ms. And when I change the current database to Database in Management Studio and call it without the fully qualified name, it's much faster:
    select * from Table where NAME = 'cat'
    It takes 17 ms. Is there any way to make fully qualified queries faster without changing the database?
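    One way to check whether the gap is real extra work or just caching effects, offered as a suggestion rather than a diagnosis, is to run both forms with statistics enabled and compare reads and CPU time (Database and Table are the question's placeholder names, bracketed here because they are reserved words):

      SET STATISTICS IO ON;
      SET STATISTICS TIME ON;

      -- Fully qualified form
      SELECT * FROM [Database].dbo.[Table] WHERE NAME = 'cat';

      -- Unqualified form after switching the current database
      USE [Database];
      SELECT * FROM dbo.[Table] WHERE NAME = 'cat';

      SET STATISTICS IO OFF;
      SET STATISTICS TIME OFF;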

    Read the article

  • How to combine these three SQL queries into one?

    - by lam3r4370
    How can I combine these SQL queries into one?
    SELECT DISTINCT * FROM rss WHERE MATCH(content,title) AGAINST ('$filter')
    SELECT COUNT(content) FROM rss WHERE MATCH(content,title) AGAINST ('$filters')
    And if the result of the above query is 0:
    SELECT DISTINCT * FROM rss WHERE content LIKE '%$filters%' OR title LIKE '%$filters%';
    $filter .= $row['filter'];
    $filters = $row['filter'];
    $filters may be more than one keyword.
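    One way to fold the fallback into a single statement, sketched as an assumption (and keeping the question's interpolated $filters, which would normally be a bound parameter), is a UNION whose second branch only produces rows when the full-text search finds nothing:

      SELECT DISTINCT *
      FROM rss
      WHERE MATCH(content, title) AGAINST ('$filters')

      UNION

      SELECT DISTINCT *
      FROM rss
      WHERE (content LIKE '%$filters%' OR title LIKE '%$filters%')
        AND NOT EXISTS (
            SELECT 1 FROM rss WHERE MATCH(content, title) AGAINST ('$filters')
        );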

    Read the article

  • How to translate these 2 queries from MySQL to PostgreSQL?

    - by xRobot
    How can I translate these 2 queries into PostgreSQL?
    CREATE TABLE `example` (
      `id` int(10) unsigned NOT NULL auto_increment,
      `from` varchar(255) NOT NULL default '0',
      `message` text NOT NULL,
      `lastactivity` timestamp NULL default '0000-00-00 00:00:00',
      `read` int(10) unsigned NOT NULL,
      PRIMARY KEY (`id`),
      KEY `from` (`from`)
    ) DEFAULT CHARSET=utf8;
    Query:
    SELECT * FROM table_1 LEFT OUTER JOIN table_2 ON ( table_1.id = table_2.id ) WHERE (table_1.lastactivity > NOW()-100);
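    A possible PostgreSQL translation, offered only as a sketch (PostgreSQL has no unsigned integers or zero timestamps, so SERIAL and NULL stand in, and the MySQL NOW()-100 is assumed to mean "the last 100 seconds"):

      CREATE TABLE example (
          id           SERIAL PRIMARY KEY,
          "from"       VARCHAR(255) NOT NULL DEFAULT '0',  -- "from" is a reserved word, so it stays quoted
          message      TEXT NOT NULL,
          lastactivity TIMESTAMP NULL,
          "read"       INTEGER NOT NULL
      );
      CREATE INDEX example_from_idx ON example ("from");

      SELECT *
      FROM table_1
      LEFT OUTER JOIN table_2 ON (table_1.id = table_2.id)
      WHERE table_1.lastactivity > NOW() - INTERVAL '100 seconds';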

    Read the article

  • Need help with 2 MySQL queries: join vs. subqueries.

    - by BugBusterX
    I have 2 tables:
    user: id, name
    message: sender_id, receiver_id, message, read_at, created_at
    There are 2 results I need to retrieve, and I'm trying to find the best solution. I have included the queries I'm using at the very end.
    1. I need to retrieve a list of users, and with each user have information available on whether there are any unread messages from that user (them as sender, me as receiver) and whether or not there are any read messages between us (they send and I'm the receiver, or I send and they are the receiver).
    2. Same as above, but only those members where there has been any messaging between us, sorted by unread first, then by last message received.
    Can you advise please? Should this be done with joins or subqueries? In the first case I do not need the count, I just need to know whether or not there is at least one unread message. I'm posting the code and my current queries, please have a look when you get a chance.
    BTW, everything is the way I want in the first query. My concern is: in the second query I would like to order by message.created_at, but I don't think I can do that with grouping? And I also don't know if this approach is the most optimized and fast.
    CREATE TABLE `user` (
      `id` bigint(20) NOT NULL AUTO_INCREMENT,
      `name` varchar(255) NOT NULL,
      PRIMARY KEY (`id`)
    )
    INSERT INTO `user` VALUES (1,'User 1'),(2,'User 2'),(3,'User 3'),(4,'User 4'),(5,'User 5');
    CREATE TABLE `message` (
      `id` bigint(20) NOT NULL AUTO_INCREMENT,
      `sender_id` bigint(20) DEFAULT NULL,
      `receiver_id` bigint(20) DEFAULT NULL,
      `message` text,
      `read_at` datetime DEFAULT NULL,
      `created_at` datetime NOT NULL,
      PRIMARY KEY (`id`)
    )
    INSERT INTO `message` VALUES (1,3,1,'Messge',NULL,'2010-10-10 10:10:10'),(2,1,4,'Hey','2010-10-10 10:10:12','2010-10-10 10:10:11'),(3,4,1,'Hello','2010-10-10 10:10:19','2010-10-10 10:10:15'),(4,1,4,'Again','2010-10-10 10:10:25','2010-10-10 10:10:21'),(5,3,1,'Hiii',NULL,'2010-10-10 10:10:21');
    SELECT u.*, m_new.id as have_new, m.id as have_any
    FROM user u
    LEFT JOIN message m_new ON (u.id = m_new.sender_id AND m_new.receiver_id = 1 AND m_new.read_at IS NULL)
    LEFT JOIN message m ON ((u.id = m.sender_id AND m.receiver_id = 1) OR (u.id = m.receiver_id AND m.sender_id = 1))
    GROUP BY u.id

    SELECT u.*, m_new.id as have_new, m.id as have_any
    FROM user u
    LEFT JOIN message m_new ON (u.id = m_new.sender_id AND m_new.receiver_id = 1 AND m_new.read_at IS NULL)
    LEFT JOIN message m ON ((u.id = m.sender_id AND m.receiver_id = 1) OR (u.id = m.receiver_id AND m.sender_id = 1))
    WHERE m.id IS NOT NULL
    GROUP BY u.id
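    For the ordering concern in the second query, one option (assumed rather than benchmarked, with user id 1 standing in for the current user as in the question) is to aggregate created_at so it can be used alongside GROUP BY, then sort unread-first followed by the latest message time:

      SELECT u.*,
             MAX(m_new.id)     AS have_new,
             MAX(m.id)         AS have_any,
             MAX(m.created_at) AS last_message_at
      FROM user u
      LEFT JOIN message m_new
             ON u.id = m_new.sender_id AND m_new.receiver_id = 1 AND m_new.read_at IS NULL
      LEFT JOIN message m
             ON (u.id = m.sender_id AND m.receiver_id = 1)
             OR (u.id = m.receiver_id AND m.sender_id = 1)
      WHERE m.id IS NOT NULL
      GROUP BY u.id
      ORDER BY (MAX(m_new.id) IS NOT NULL) DESC, last_message_at DESC;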

    Read the article

  • Which is more efficient in MySQL: a big join or multiple queries of a single table?

    - by Tom Greenpoint
    I have a MySQL database like this:
    Post – 500,000 rows (Postid, Userid)
    Photo – 200,000 rows (Photoid, Postid)
    About 50,000 posts have photos, average 4 each; most posts do not have photos. I need to get a feed of all posts with photos for a userid, average 50 posts each. Which approach would be more efficient?
    1: Big join
    select * from post left join photo on post.postid=photo.postid where post.userid=123
    2: Multiple queries
    select * from post where userid=123
    while (loop through rows) { select * from photo where postid=row[postid] }
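    A third pattern that often sits between the two, sketched here as an assumption about the workload rather than a benchmark, is two queries total: one for the user's posts, then one IN query for all of their photos at once instead of one photo query per post:

      SELECT * FROM post WHERE userid = 123;

      -- Collect the returned postids in application code, then fetch
      -- every photo for the feed in a single round trip:
      SELECT * FROM photo WHERE postid IN (101, 102, 103);  -- ids gathered from the first query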

    Read the article

  • How to write native SQL queries in Hibernate without hardcoding table names and fields?

    - by serg555
    Sometimes you have to write some of your queries in native SQL rather than Hibernate HQL. Is there a nice way to avoid hardcoding table names and fields and get this data from the existing mapping? For example, instead of:
    String sql = "select user_name from tbl_user where user_id = :id";
    something like:
    String sql = "select " + Hibernate.getFieldName("user.name") + " from " + Hibernate.getTableName(User.class) + " where " + Hibernate.getFieldName("user.id") + " = :id";

    Read the article

  • ASP.NET MVC <OutputCache> SqlDependency (CommandNotification?) with LINQ queries

    - by sinni800
    Hello, I use LINQ queries in my ASP.NET MVC application and want to use OutputCache in some of my actions. I hear this should be possible with command notifications, but those seem to only apply to self-created SqlCommands, or am I wrong? Can I manually tell SQL Server to send SqlDependency notifications if certain tables change? And if yes, how can I attach them to the OutputCache? Another side question: can you do this with strongly typed views too? Thank you in advance...

    Read the article

  • How can I generate an Expression tree that queries an object with List<T> as a property?

    - by David Robbins
    Forgive my clumsy explanation, but I have a class that contains a List:
    public class Document
    {
        public int OwnerId { get; set; }
        public List<User> Users { get; set; }
        public Document() { }
    }
    public class User
    {
        public string UserName { get; set; }
        public string Department { get; set; }
    }
    Currently I use PredicateBuilder to perform dynamic queries on my objects. How can I turn the following LINQ statement into an expression tree?
    var predicate = PredicateBuilder.True<User>();
    predicate = predicate.And<User>(user => user.Department == "HR");
    var deptDocs = documents.AsQueryable()
        .Where(doc => doc.Users.AsQueryable().Count(predicate) > 0)
        .ToList();
    In other words:
    var deptDocs = documents.HasUserAttributes("Department", "HR").ToList();

    Read the article

  • Trying to use VB to automate some queries. Running into what looks like a string problem

    - by Jeff
    Hi there. I'm using MS Access 2003 and I'm trying to execute a few queries at once using VB. When I write out the query in SQL it works fine, but when I try to do it in VB it asks me to "Enter Parameter Value" for DEPA, then DND (which are the first few letters of two strings I have). Here's the code:
    Option Compare Database

    Public Sub RemoveDupelicateDepartments()
        Dim oldID As String
        Dim newID As String
        Dim sqlStatement As String

        oldID = "DND-01"
        newID = "DEPA-04"

        sqlStatement = "UPDATE [Clean student table] SET [HomeDepartment]=" & newID & " WHERE [HomeDepartment]=" & oldID & ";"
        DoCmd.RunSQL sqlStatement & ""
    End Sub
    It looks to me as though it's taking in the string up to the - and then nothing else. I dunno, that's why I'm asking lol. What should my code look like?
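    The "Enter Parameter Value" prompts suggest the values are being concatenated into the SQL without quotes, so Access reads DEPA and DND as unknown names and treats them as parameters. A guess at the SQL currently being sent:

      UPDATE [Clean student table] SET [HomeDepartment]=DEPA-04 WHERE [HomeDepartment]=DND-01;

    versus the statement that would work, with the values wrapped in quotes so they are treated as string literals:

      UPDATE [Clean student table] SET [HomeDepartment]='DEPA-04' WHERE [HomeDepartment]='DND-01';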

    Read the article

  • Trying to Unit Test A Class That Makes DB Queries Using Hibernate And Can't Get Session Created...

    - by Jared Michaels
    I am trying to implement JUnit tests for a class that performs DB queries using Hibernate. When I create the class under test, I get access to the session through the factory by doing the following:
    InitialContext context = new InitialContext();
    sessionFactory = (SessionFactory) context.lookup(hibernateContext);
    This works fine when I deploy this to JBoss 5.1. I am trying to figure out how to get this to work with my JUnit test. I keep getting an exception stating that I "Need to specify class name in environment or system property, or as an applet parameter, or in an application resource file: java.naming.factory.initial". I've searched high and low but haven't been able to find any information about what specifically I need to do to get this to work. I am not using Spring or any frameworks, just plain old Java and JUnit.

    Read the article
