Search Results

Search found 4815 results on 193 pages for 'parameterized queries'.


  • How can I write a MySQL query to check multiple rows?

    - by Matt
    I have a MySQL table containing data on product features:

        feature_id  feature_product_id  feature_finder_id  feature_text  feature_status_yn
        1           1                   1                  Webcam        y
        2           1                   1                  Speakers      y
        3           1                   1                  Bluray        n

    I want to write a MySQL query that allows me to search for all products that have a 'y' feature_status_yn value for a given feature_product_id and return the feature_product_id. The aim is to use this as a search tool to allow me to filter results to product IDs only matching the requested feature set. A query of

        SELECT feature_id FROM product_features WHERE feature_finder_id = '1' AND feature_status_yn = 'y'

    will return all of the features of a given product. But how can I select all products (feature_product_id) that have a 'y' value when they are on separate rows? Multiple queries might be one way to do it, but I'm wondering whether there's a more elegant solution based purely in SQL.
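
    One common approach (a hedged sketch, not part of the original question, using the table and feature names from the sample above) is to group by product and require that every requested feature is present with a 'y' status:

        SELECT feature_product_id
        FROM   product_features
        WHERE  feature_text IN ('Webcam', 'Speakers')      -- the requested feature set
          AND  feature_status_yn = 'y'
        GROUP  BY feature_product_id
        HAVING COUNT(DISTINCT feature_text) = 2;           -- must match all requested features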

    Read the article

  • Growing MS Access File Size problem

    - by user55886
    I have a large MS Access application with a lot of computations in VBA code. When I run it, it eventually crashes due to excessive file size. A lot of intermediate tables and queries are created and subsequently deleted, but Access does not reclaim the space. I have diligently closed all intermediate recordsets and set all temporary objects to Nothing, but nothing helps. The only way I can get my code to run is to run part of it, stop to compact and repair the file, then restart the code. Isn't there a better way? Thanks

    Read the article

  • What does this `^` mean here in Solr?

    - by Rahul Mehta
    I am confused here, but I want to clear my doubt. I think it is a stupid question, but I want to know. Quoting the page linked below:

        "Use a TokenFilter that outputs two tokens (one original and one lowercased) for each input token. For queries, the client would need to expand any search terms containing upper case characters to two terms, one lowercased and one original. The original search term may be given a boost, although it may not be necessary given that a match on both terms will produce a higher score.

        text:NeXT ==> (text:NeXT^10 OR text:next)"

    What does this ^ mean here?
    http://wiki.apache.org/solr/SolrRelevancyCookbook#Relevancy_and_Case_Matching

    Read the article

  • Compact web server with Lua support?

    - by OverTheRainbow
    Hello, I need to find a very compact, cross-platform web server that can run Lua scripts, i.e. either a regular web server like Mongoose that will forward queries to a Lua program through e.g. FastCGI, or a web server itself written in Lua, which would save the need to provide a separate web server. I recently started learning about Lua, so I am still in the dark about what is available out there, save for the three I came across:

        Barracuda Embedded Web Server  http://barracudaserver.com/ba/doc/
        Xavante - Lua HTTP 1.1 Web server  http://keplerproject.github.com/xavante/
        Haserl  http://haserl.sourceforge.net/

    If someone's already done this recently, what solution would you recommend, along with any tutorial/article that would get me started? Thank you.

    Read the article

  • How do I make this query against EF efficient?

    - by dudeNumber4
    Using EF4. Assume I have this:

        IQueryable<ParentEntity> qry = myRepository.GetParentEntities();
        Int32 n = 1;

    What I want to do is this, but EF can't compare against null:

        qry.Where( parent => parent.Children.Where( child => child.IntCol == n ) != null )

    What works is this, but the SQL it produces (as you would imagine) is pretty inefficient:

        qry.Where( parent => parent.Children.Where( child => child.IntCol == n ).FirstOrDefault().IntCol == n )

    How can I do something like the first comparison to null without generating nested queries and so forth?
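
    As a hedged aside (not part of the original question): the usual LINQ shape for this kind of test is parent.Children.Any(child => child.IntCol == n), which Entity Framework typically translates to an EXISTS subquery rather than a per-row nested lookup. Roughly the SQL being aimed for, with assumed table and column names:

        SELECT p.*
        FROM   Parents p
        WHERE  EXISTS (SELECT 1
                       FROM   Children c
                       WHERE  c.ParentId = p.Id   -- assumed key columns
                         AND  c.IntCol   = @n);   -- @n is the filter value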

    Read the article

  • jQuery AJAX & Multiple sp Result Sets

    - by Kevin
    Is it possible to use a stored procedure that returns multiple result sets in JSON format and process them as part of one request using AJAX calls in jQuery? In other words, I have a stored procedure that returns several result sets that are to be used with a series of select boxes, all being filtered by the same criteria. If any of the select boxes is chosen, that value is then passed to the stored procedure, and all the subsequent select box updates reflect only results that match the filtered criteria. I don't want to have to call the same stored procedure multiple times to process the results, and I was trying not to create multiple queries, so I'm wondering if it's possible to return more than one JSON result in a single request and then store and process them on the client side.

    Read the article

  • Rails: getting logic to run at end of request, regardless of filter chain aborts?

    - by JSW
    Is there a reliable mechanism discussed in the Rails documentation for calling a function at the end of the request, regardless of filter chain aborts? It's not after filters, because after filters don't get called if any prior filter redirected or rendered. For context, I'm trying to put some structured profiling/reporting information into the app log at the end of every request. This information is collected throughout the request lifetime via instance variables wrapped in custom controller accessors, and dumped at the end in a JSON blob for use by a post-processing script. My end goal is to generate reports about my application's logical query distribution (things that depend on controller logic, not just request URIs and parameters), performance profile (time spent in specific DB queries or blocked on web services), failure rates (including invalid incoming requests that get rejected by before_filter validation rules), and a slew of other things that cannot really be parsed from the basic information in the application and Apache logs. At a higher level, is there a different "Rails way" that solves my app profiling goal?

    Read the article

  • PostgreSQL: How to index all foreign keys?

    - by biggusjimmus
    I am working with a large PostgreSQL database, and we are trying to tune it to get more performance. Our queries and updates seem to be doing a lot of lookups using foreign keys. What I would like is a relatively simple way to add indexes to all of our foreign keys without having to go through every table (~140) and doing it manually. In researching this, I've come to find that there is no way to have Postgres do this for you automatically (like MySQL does), but I would be happy to hear otherwise, too.
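
    A hedged sketch of one way to do it (assuming PostgreSQL 9.0+ for string_agg): generate the CREATE INDEX statements from the system catalogs, review them, and then run the ones you want. Column order within multi-column foreign keys is not handled here, and foreign keys that already have a matching index are not filtered out.

        SELECT 'CREATE INDEX ON ' || quote_ident(ns.nspname) || '.' || quote_ident(tbl.relname)
               || ' (' || string_agg(quote_ident(att.attname), ', ') || ');' AS ddl
        FROM   pg_constraint con
        JOIN   pg_class      tbl ON tbl.oid = con.conrelid
        JOIN   pg_namespace  ns  ON ns.oid  = tbl.relnamespace
        JOIN   pg_attribute  att ON att.attrelid = tbl.oid
                                AND att.attnum   = ANY (con.conkey)
        WHERE  con.contype = 'f'                              -- foreign-key constraints only
        GROUP  BY con.oid, ns.nspname, tbl.relname
        ORDER  BY ns.nspname, tbl.relname;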

    Read the article

  • Inheritance in Kohana

    - by Binaryrespawn
    Hi all, I have recently started to use Kohana, and I know inheritance is in its infancy at the moment. The workaround is using a $_has_one annotation on the child class model. In my case I have "page" as the parent of "article". I have something like:

        protected $_has_one = array('mypage' => array('model' => 'page', 'foreign_key' => 'id'));

    In my controller, I have an action which queries the database. In this query I am trying to access fields from the parent of "article", which is "page":

        $n->articles = ORM::factory('article')->where('expires','=',0)
            ->where('articledate','<',date('y-m-d'))
            ->where('expirydate','>',date('y-m-d'))
            ->where('mypage->status','=','PUBLISHED')
            ->order_by('articledate','desc')
            ->find_all();

    The status column resides in the page table, and my query is generating an error to the effect of "cannot find status", clearly because it belongs to the parent. Any ideas?

    Read the article

  • Including associations optimization in Rails

    - by Vitaly
    Hey, I'm looking for help with Ruby optimization regarding loading of associations on demand. This is a simplified example. I have 3 models: Post, Comment, User. The references are: Post has many Comments, and Comment has a reference to User (:author). Now when I go to the post page, I expect to see the post body plus all comments (and their respective authors' names). This requires the following 2 queries:

        select * from Post                       -- to get post data (1 row)
        select * from Comment inner join User    -- to get comments + usernames (N rows)

    In the code I have:

        Post.find(params[:id], :include => { :comments => [:author] })

    But it doesn't work as expected: as I see in the back end, there are still N+1 hits (some of them are cached though). How can I optimize that?

    Read the article

  • In MS Access, is there a way to allow forms to update while maintaining read only?

    - by Alex
    I have several forms linked to tables via queries. The forms pull data such as sales and ratios by selecting a product from the main form's combo box. I am however having two issues:

    1 - I would ultimately prefer the combo box to allow free entry; however, just by typing in the box and hitting Enter (not a button called "enter" on a screen, which would initiate recalcs, just the normal Enter key), while it does bring the new information into the sub-forms, it also changes the information in the original table. If I make the table read only, it just doesn't allow the form to work, saying that the table is read only.

    2 - The same read-only issue occurs when another user with read-only rights tries to use the database.

    I understand that read only is functioning as intended; however, I am wondering if there is a way to make some functions work while disallowing the updating. I am unfortunately learning on the go, so go easy on me please. Thank you

    Read the article

  • Multi-join query returns too many results and improperly matched rows

    - by Woot4Moo
    I have the following minimal schema in Oracle: http://sqlfiddle.com/#!4/c1ed0/14

    The queries I have run yield too many results, and this query will yield incorrect results:

        select cat.*, status.*, source.*
        from cats cat, status status, source source
        Left OUTER JOIN source source2 on source2.sourceid = 1
        Right OUTER JOIN status status2 on status2.isStray = 0
        order by cat.name

    What I am expecting is a table that looks like the following, but I cannot seem to come up with the correct SQL:

        NAME     AGE  LENGTH  STATUSID  CATSOURCE  ISSTRAY  SOURCEID  CATID
        Adam     1    25      null      null       null     1         2
        Bill     5    1       null      null       null     null      null
        Charles  7    5       null      null       null     null      null
        Steve    12   15      1         1          1        1         1

    In plain English, what I am looking for is to return all known cats plus their associated cat source and cat status, while retaining null values. The only information I will have is the source that I am curious about. I also only want the cats that have a status of either STRAY or UNKNOWN (null).
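
    A hedged sketch of the join shape being described (the real key columns live in the sqlfiddle schema, so the columns used in the ON clauses below are assumptions): left-join the optional tables onto cats so unmatched cats keep their null columns, and filter to stray or unknown status.

        SELECT cat.*, status.*, source.*
        FROM   cats cat
        LEFT OUTER JOIN source source
               ON  source.catid    = cat.id        -- assumed key columns
               AND source.sourceid = 1             -- only the source of interest
        LEFT OUTER JOIN status status
               ON  status.catid = cat.id           -- assumed key columns
        WHERE  status.isstray = 1                  -- STRAY
           OR  status.isstray IS NULL              -- UNKNOWN (no status row)
        ORDER  BY cat.name;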

    Read the article

  • Database indexes - what should they be

    - by WebweaverD
    Most of my database tables have a clear unique index through which lookups are done 90% of the time, but I am a bit unsure on this one. I have a table which keeps track of user rating totals for items in my database. I now want to add another table to track individual ratings, with an IP address column to make sure no one can rate something twice. Since I can see this becoming a big, high-use table, it is important to optimize it correctly. (MySQL table)

    This table will have the following fields: rating_id (always - unique), item_id (always - not unique), user_id (optional - not unique), ip_address (always - not unique), rating_value (always - not unique), has_review (bool).

    Now I envision 90% of the queries going something like this:

        When a user rates something: select where item_id = x and ip_address = y, (if rows = 0) insert rating
        When in user account pages:  select where ip_address = x or username = y

    None of the fields searched on is unique. Can I still use them as indexes (for example item_id and ip_address)? Can I have two indexes, and will this still improve performance over a non-indexed table?
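
    A hedged sketch (assuming the table is named ratings): non-unique columns can certainly be indexed, and a composite index matching the first lookup pattern covers it; making that index UNIQUE also enforces the one-rating-per-IP-per-item rule at the database level.

        ALTER TABLE ratings
          ADD UNIQUE KEY uq_item_ip (item_id, ip_address),  -- serves "item_id = x AND ip_address = y"
          ADD KEY idx_ip   (ip_address),                    -- serves the account-page lookup by IP
          ADD KEY idx_user (user_id);                       -- serves the account-page lookup by user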

    Read the article

  • Combining query rows in a loop

    - by icemanind
    I have the following ColdFusion 9 code:

        <cfloop from="1" to="#arrayLen(tagArray)#" index="i">
            <cfquery name="qryGetSPFAQs" datasource="#application.datasource#">
                EXEC searchFAQ '#tagArray[i]#'
            </cfquery>
        </cfloop>

    The EXEC executes a stored procedure on the database server, which returns rows of data depending on what the parameter is. What I am trying to do is combine the queries into one query object. In other words, if it loops 3 times and each loop returns 4 rows, I want a query object that has all 12 rows in one object. How do I achieve this?

    Read the article

  • Parsing HTML with XPath and PHP

    - by Peter
    Is there a way (using XPath and PHP) to do the following (WITHOUT external XSLT files)?

        - Remove all tables and their contents
        - Remove everything after the first h1 tag
        - Keep only paragraphs (INCLUDING their inner HTML (links, lists, etc))

    I received an XSLT answer here, but I'm looking for XPath queries that don't require external files. Currently, I've got the HTML in question loaded into a SimpleXMLElement via:

        $doc = @DOMDocument::loadHTML($xml);
        $data = simplexml_import_dom($doc);

    Now I need help with:

        $data = $data->xpath('??????');

    Been working with this one for several days to no avail. I really appreciate the help.

    Edit: I don't particularly care what's inside the paragraphs, as I can use strip_tags to eliminate what I don't want. All I need to do is to isolate the paragraphs from the rest of the source. I suppose a more specific, accurate requirement would be this: return only paragraphs (and their HTML contents) that aren't contained in tables, and only before the first h1 tag.

    Read the article

  • Does normalization really hurt performance in high traffic sites?

    - by Luke101
    I am designing a database, and I would like to normalize it. In one query I will be joining about 30-40 tables. Will this hurt the website's performance if it ever becomes extremely popular? This will be the main query, and it will be getting called 50% of the time. In the other queries I will be joining about 2 tables. I have a choice right now to normalize or not to normalize, but if the normalization becomes a problem in the future I may have to rewrite 40% of the software, and it may take me a long time. Does normalization really hurt in this case? Should I denormalize now while I have the time?

    Read the article

  • MySQL Cluster data nodes - slow SELECTs

    - by Boyan Georgiev
    Hi all. First off, I'm new to MySQL Cluster. This is my pain: I've managed to set up a MySQL Cluster with two data nodes, two SQL nodes and one management server. Everything works pretty well, except the following: my data nodes are spread across an intranet link which adds latency to communication between the data nodes. Apparently, due to MySQL Cluster's internal partitioning schemes, when my PHP application pulls data from the cluster via SELECT queries, parts of the data are pulled from both data nodes. This makes the page appear onscreen REALLY slowly. If I bring one data node offline, the data can only be pulled from that single remaining data node, and thus the final result (HTML output) appears on the screen in a very timely fashion. So, my question is this: can the data nodes/cluster be told to pull data from partitions stored only on a particular data node?

    Read the article

  • CakePHP: Using two tables for a single model

    - by mwaterous
    I'm just picking up development in CakePHP right now, so forgive me if this seems obvious; it did to me when I first read about has, belongsTo, hasMany, etc. The problem is I would like to associate two tables with a single model, and I was wondering if there is a way to configure this so that when CakePHP does its queries it automatically performs a join on the two tables. I don't want to create a separate model for the second table, as it is merely a meta information table - the master table will contain the primary information required; the meta table will be populated with secondary information that is not required and therefore may or may not be set for every row of the master table.

    Read the article

  • Insert record into MySQL DB with Entity Framework

    - by sanfra1983
    Hi, the problem is that I need to insert a new record into a MySQL table. I have already done the mapping of the MySQL DB, and I have already run tests returning data, and everything works. Now I read from a .txt file in which queries are written; I want to run them and get back true or false based on the final outcome of each single query written in the file. I did this:

        using (var we = new demotestEntities())
        {
            foreach (var l in listaqueri)
            {
                var p = we.CreateQuery<category>(l);
                we.SaveChanges();
                result = true;
            }
        }

    But it does not work: it returns no errors, but neither does it apply the result of the query. The .txt file is as follows:

        INSERT INTO category (id, name) VALUES (null, 'test2')

    Can anyone help me?

    Read the article

  • Dynamic query to immediately execute?

    - by Curtis White
    I am using the MSDN Dynamic LINQ to SQL package. It allows using strings for queries. But the returned type is an IQueryable and not an IQueryable<T>, so I do not have the ToList() method. How can I immediately execute this without manually enumerating over the IQueryable? My goal is to databind to the Selecting event on a LINQ to SQL datasource, and that throws a DataContext disposed exception. I can set the query as the DataSource on a GridView, though. Any help greatly appreciated! Thanks. The Dynamic LINQ to SQL library is the one from the samples that come with Visual Studio.

    Read the article

  • MySQL: search using FULLTEXT or LIKE? Which is better?

    - by user156814
    I'm working on a search feature for my application; I want to search all articles in the database. As of now, I'm using LIKE in my queries, but I want to add a "Related Articles" feature, sort of like what SO has in the sidebar (which I see as a problem if I use LIKE). What's better to use for MySQL searching, FULLTEXT or LIKE... or anything else I might not know about? Also, I'm using the Kohana framework, so if anybody knows an easy way to do full-text matching using the query builder, I'd appreciate that. Thanks.
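
    A hedged sketch of MySQL full-text search (the table and column names are assumptions; note that in the MySQL versions of that era, FULLTEXT indexes required the MyISAM storage engine):

        ALTER TABLE articles ADD FULLTEXT ft_articles (title, body);

        SELECT id, title,
               MATCH(title, body) AGAINST ('some search terms') AS relevance
        FROM   articles
        WHERE  MATCH(title, body) AGAINST ('some search terms')
        ORDER  BY relevance DESC
        LIMIT  10;

    Running the same MATCH ... AGAINST with the text of the current article as the search string is also a common way to build a "Related Articles" list.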

    Read the article

  • Compiled query using list of class objects in C#

    - by Sukan
    Hello, can somebody help me out with creating compiled queries where the input is a list of class objects? I have seen examples where a Func<DataContext, somematchobject, IQueryable<T>> is created and compiled. But can I do something like Func<List<T>, matchObject, T>, and compile it? Basically I want an object (T) meeting certain conditions (as in matchObject) to be returned from a list of objects (List<T>). Will CompiledQuery.Compile help me with this? Please help me, experts!

    Read the article

  • How to limit the number of connections to a SQL Server server from my Tomcat-deployed Java application

    - by CJ
    I have an application that is deployed on Tomcat on server A and sends queries to a huge variety of SQL Server databases on a server B. I am concerned that my application could overload this SQL Server database server, and I would like some way of preventing it from making requests to connect to any database on that server if some arbitrary number of connections were already in existence and unclosed. I am looking at using connection pooling, but am under the impression that this will only pool connections to a specific database on the SQL Server server; I want to control the total of these combined connections that will occur to many different databases (incidentally, I can only find out the names of individual DBs dynamically, as they change day to day). Will connection pooling take care of this for me, or am I looking at this from the wrong perspective? I have no access to the configuration of the SQL Server server. Links to tutorials or working examples of your suggested solution are most welcome!

    Read the article

  • MySQL does not utilize my CPU and RAM enough?

    - by vick
    Hello everyone! I am importing a 2.5 GB CSV file into a MySQL table. My storage engine is InnoDB. Here is the script:

        use xxx;
        DROP TABLE IF EXISTS `xxx`.`xxx`;
        CREATE TABLE `xxx`.`xxx` (
          `xxx_id` int(10) unsigned NOT NULL AUTO_INCREMENT,
          `name` varchar(128) NOT NULL,
          `yy` varchar(128) NOT NULL,
          `yyy` varchar(64) NOT NULL,
          `yyyy` varchar(2) NOT NULL,
          `yyyyy` varchar(10) NOT NULL,
          `url` varchar(64) NOT NULL,
          `p` varchar(10) NOT NULL,
          `pp` varchar(10) NOT NULL,
          `category` varchar(256) NOT NULL,
          `flag` varchar(4) NOT NULL,
          PRIMARY KEY (`xxx_id`)
        ) ENGINE=InnoDB DEFAULT CHARSET=latin1;

        set autocommit = 0;

        load data local infile '/home/xxx/raw.csv' into table company
          fields terminated by ',' optionally enclosed by '"'
          lines terminated by '\r\n'
          (name, yy, yyy, yyyy, yyyyy, url, p, pp, category, flag);

        commit;

    Why does my PC (Core i7 920 with 6 GB RAM) only consume 9% CPU power and 60% RAM when running these queries?
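
    As a hedged aside (not part of the original question): LOAD DATA runs in a single thread, so using roughly one core is expected, and a bulk InnoDB import is usually disk-bound rather than CPU-bound. The settings commonly adjusted for large imports are sketched below; the sizes are illustrative, and the first two are static in MySQL of that era, so they go in my.cnf and need a server restart.

        -- my.cnf (restart required):
        --   innodb_buffer_pool_size = 2G       -- let InnoDB cache far more pages
        --   innodb_log_file_size    = 256M     -- fewer log switches during the load
        SET GLOBAL innodb_flush_log_at_trx_commit = 2;   -- fewer fsyncs while loading
        SET unique_checks = 0;                           -- skip secondary-index uniqueness checks
        SET foreign_key_checks = 0;                      -- skip FK validation during the load
        -- ... run the LOAD DATA statement here, then COMMIT and restore the settings ...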

    Read the article

  • PHP Framework Benefits / Downfalls

    - by Lizard
    I have been a PHP developer for about 10 years now, and until about a month ago I had never used a framework. The framework I am now using, due to an existing codebase, is CakePHP 1.2. I can see certain benefits of frameworks with the basic helpers like default layouts. I can definitely see the benefits of MVC keeping the logic separate, etc. But the query building just seems to be bloated. Is this expected? Am I likely to be able to build better queries than the framework could build? I just feel I could get my apps running better without a framework. What are your thoughts?

    Read the article
