Search Results

Search found 4685 results on 188 pages for 'queries'.

Page 30 of 188

  • Tracking 502 bad gateway error

    - by dasickle
    I moved my WordPress site to WP Engine and now I constantly get 502 errors. I spoke with support and they said that it's because I have a lot of DB queries. I ran some tests: my front page only has 95 queries and a page size of about 500 KB, and most inner pages are around 60 queries. All queries are very short. Some people tell me this is common with WP Engine because they run nginx. Why do I keep getting these errors, and is there a way to track how many of them happen on a daily basis? P.S. The WP Engine log is empty, so I can't see the 502s there.

  • Search For a Query in RDL Files with PowerShell

    - by AllenMWhite
    In tracking down poorly performing queries for clients I often encounter the query text in a trace file I've captured, but don't know the source of the query. I've found that many of the poorest performing queries are those written into the reports the business users need to make their decisions. If I can't figure out where they came from, usually years after the queries were written, I can't fix them. First thing I did was find a great utility called RSScripter, which opens up a Windows dialog...(read more)
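
    For context, once the report definitions have been scripted out, finding the offending query is just a text search over the .rdl XML. A minimal PowerShell sketch of that step (the folder path and query fragment below are placeholders, not taken from the post):

        # Recursively search all .rdl report definitions for a query fragment
        Get-ChildItem -Path 'C:\Reports' -Filter *.rdl -Recurse |
            Select-String -Pattern 'FROM dbo.Orders' -SimpleMatch |
            Select-Object Path, LineNumber, Line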

  • Is it OK to try to use Plinq in all Linq queries?

    - by Tony_Henrich
    I read that PLINQ will automatically fall back to non-parallel LINQ if it finds PLINQ to be more expensive. So I figured: why not use PLINQ for everything (when possible) and let the runtime decide which one to use? The apps will be deployed to multicore servers, and I am OK with writing a little more code to deal with parallelism. What are the pitfalls of using PLINQ as the default?
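
    For reference, a minimal sketch of what opting a query into PLINQ looks like (illustrative only; the data and names are made up). The main behavioral difference to keep in mind is ordering, which AsOrdered restores at some cost:

        using System;
        using System.Linq;

        class PlinqSketch
        {
            static void Main()
            {
                var numbers = Enumerable.Range(1, 1000000);

                // Plain LINQ to Objects: sequential.
                var seq = numbers.Where(n => n % 2 == 0).Take(5).ToList();

                // Same query opted into PLINQ. AsOrdered keeps the source order in the
                // output (at some coordination cost); without it the order may differ.
                var par = numbers.AsParallel()
                                 .AsOrdered()
                                 .Where(n => n % 2 == 0)
                                 .Take(5)
                                 .ToList();

                Console.WriteLine(string.Join(", ", seq));  // 2, 4, 6, 8, 10
                Console.WriteLine(string.Join(", ", par));  // same values, same order
            }
        }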

  • How can I merge two Linq IEnumerable<T> queries without running them?

    - by makerofthings7
    How do I merge a List<T> of TPL-based tasks for later execution?

        public async IEnumerable<Task<string>> CreateTasks() { /* stuff */ }

    My assumption is .Concat(), but that doesn't seem to work:

        void MainTestApp() // Full sample available upon request.
        {
            List<string> nothingList = new List<string>();
            nothingList.Add("whatever");
            cts = new CancellationTokenSource();
            delayedExecution = from str in nothingList select AccessTheWebAsync("", cts.Token);
            delayedExecution2 = from str in nothingList select AccessTheWebAsync("1", cts.Token);
            delayedExecution = delayedExecution.Concat(delayedExecution2);
        }

        /// SNIP
        async Task AccessTheWebAsync(string nothing, CancellationToken ct)
        {
            // return a Task
        }

    I want to make sure that this won't spawn any task or evaluate anything. In fact, I suppose I'm asking: "what logically executes an IQueryable into something that returns data?"

    Background: Since I'm doing recursion and I don't want to execute this until the correct time, what is the correct way to merge the results if called multiple times? If it matters, I'm thinking of launching all the tasks with

        var AllRunningDataTasks = results.ToList();

    followed by this code:

        while (AllRunningDataTasks.Count > 0)
        {
            // Identify the first task that completes.
            Task<TableResult> firstFinishedTask = await Task.WhenAny(AllRunningDataTasks);

            // ***Remove the selected task from the list so that you don't
            // process it more than once.
            AllRunningDataTasks.Remove(firstFinishedTask);

            // TODO: Await the completed task.
            var taskOfTableResult = await firstFinishedTask;

            // TODO: (doesn't work) TrustState thisState = (TrustState)firstFinishedTask.AsyncState;
            // TODO: Update the concurrent dictionary with data
            //       thisState.QueryStartPoint + thisState.ThingToSearchFor
            Interlocked.Decrement(ref thisState.RunningDirectQueries);
            Interlocked.Increment(ref thisState.CompletedDirectQueries);
            if (thisState.RunningDirectQueries == 0)
            {
                thisState.TimeCompleted = DateTime.UtcNow;
            }
        }
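
    A small self-contained sketch of the behavior the question hinges on (all names below are made up, not from the post): with LINQ to Objects, both the select and the Concat are deferred, so the tasks are only created once the combined sequence is actually enumerated (ToList, foreach, etc.).

        using System;
        using System.Collections.Generic;
        using System.Linq;
        using System.Threading.Tasks;

        class DeferredConcatSketch
        {
            // Deferred: Work() is not called while this query is merely composed.
            static IEnumerable<Task<string>> MakeTasks(string tag) =>
                from s in new[] { "a", "b" } select Work(tag + s);

            static async Task<string> Work(string input)
            {
                await Task.Delay(10);
                return input;
            }

            static async Task Main()
            {
                var combined = MakeTasks("x").Concat(MakeTasks("y")); // still nothing running

                var running = combined.ToList();       // enumeration happens here: tasks start
                var results = await Task.WhenAll(running);
                Console.WriteLine(string.Join(", ", results));
            }
        }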

  • LINQ-like or SQL-like DSL for end-users to run queries to select (not modify) data?

    - by Mark Rushakoff
    For a utility I'm working on, the client would like to be able to generate graphic reports on the data that has been collected. I can already generate a couple of canned graphs (using ZedGraph, which is a very nice library); however, the utility would be much more flexible if the graphs were more programmable or configurable by the end user.

    TL;DR version: I want users to be able to use something like SQL to safely extract and select data from a List of objects that I provide and can describe. What free tools or libraries will help me accomplish this?

    Full version: I've given thought to using IronPython, IronRuby, and LuaInterface, but frankly they're all a bit overpowered for what I want to do. My classes are fairly simple, along the lines of:

        class Person:
            string Name;
            int HeightInCm;
            DateTime BirthDate;
            Weight[] WeighIns;

        class Weight:
            int WeightInKg;
            DateTime Date;
            Person Owner;

    (Exact classes have been changed to protect the innocent.) To come up with the data for the graph, the user will choose whether it's a bar graph, scatter plot, etc., and then, to actually obtain the data, I would like to obtain some kind of List from the user simply entering something SQL-ish along the lines of

        SELECT Name, AVG(WeighIns) FROM People
        SELECT WeightInKg, Owner.HeightInCm FROM Weights

    And as a bonus, it would be nice if you could actually do operations as well:

        SELECT WeightInKg, (Date - Owner.BirthDate) AS Age FROM Weights

    The DSL doesn't have to be compliant SQL in any way; it doesn't even have to resemble SQL, but I can't think of a more efficient descriptive language for the task. I'm fine filling in blanks; I don't expect a library to do everything for me. What I would expect to exist (but haven't been able to find in any way, shape, or form) is something like Fluent NHibernate (which I am already using in the project) where I can declare a mapping, something like

        var personRequest = Request<Person>();
        personRequest.Item("Name", (p => p.Name));
        personRequest.Item("HeightInCm", (p => p.HeightInCm));
        personRequest.Item("HeightInInches", (p => p.HeightInCm * CM_TO_INCHES));
        // ...
        var weightRequest = Request<Weight>();
        weightRequest.Item("Owner", (w => w.Owner), personRequest); // Indicate a chain to personRequest
        // ...
        var people = Table<Person>("People", GetPeopleFromDatabase());
        var weights = Table<Weight>("Weights", GetWeightsFromDatabase());
        // ...
        TryRunQuery(userInputQuery);

    LINQ is so close to what I want to do, but AFAIK there's no way to sandbox it. I don't want to expose any unnecessary functionality to the end user; meaning I don't want the user to be able to send in and process:

        from p in people select (p => { System.IO.File.Delete("C:\\something\\important"); return p.Name })

    So does anyone know of any free .NET libraries that allow something like what I've described above? Or is there some way to sandbox LINQ? cs-script is close too, but it doesn't seem to offer sandboxing yet either. I'd be hesitant to expose the NHibernate interface either, as the user should have a read-only view of the data at this point in the usage. I'm using C# 3.5, and pure .NET solutions would be preferred. The bottom line is that I'm really trying to avoid writing my own parser for a subset of SQL that would only apply to this single project.

  • SELECT product from subclass: How many queries do I need?

    - by Stefano
    I am building a database similar to the one described here, where I have products of different types, each type with its own attributes. I report a short version for convenience:

        product_type
        ============
        product_type_id    INT
        product_type_name  VARCHAR

        product
        =======
        product_id         INT
        product_name       VARCHAR
        product_type_id    INT      -> foreign key to product_type.product_type_id
        ...                (attributes common to all products)

        magazine
        ========
        magazine_id        INT
        title              VARCHAR
        product_id         INT      -> foreign key to product.product_id
        ...                (magazine-specific attributes)

        web_site
        ========
        web_site_id        INT
        name               VARCHAR
        product_id         INT      -> foreign key to product.product_id
        ...                (web-site-specific attributes)

    This way I do not need to make a huge table with a column for each attribute of the different product types (most of which would then be NULL). How do I SELECT a product by product.product_id and see all its attributes? Do I have to make a query first to know what type of product I am dealing with and then, through some logic, make another query to JOIN the right tables? Or is there a way to join everything together? (If, when I retrieve the information about a product_id, there are a lot of NULLs, that would be fine at this point.) Thank you.
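
    For what it's worth, the single-query option usually takes the shape of one LEFT JOIN per subtype table, with the columns of the non-matching subtypes coming back as NULL. A sketch against the schema above (the literal 42 is just a placeholder id):

        SELECT p.*, m.*, w.*
        FROM product AS p
        LEFT JOIN magazine AS m ON m.product_id = p.product_id
        LEFT JOIN web_site AS w ON w.product_id = p.product_id
        WHERE p.product_id = 42;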

  • What is the best way to pre-filter user access for SQLAlchemy queries?

    - by steve
    I have been looking at the SQLAlchemy recipes on their wiki, but I don't know which one is best for implementing what I am trying to do. Every row in my tables has a user_id associated with it. Right now, for every query, I first filter by the id of the currently logged-in user and then by the criteria I am interested in. My concern is that the developers might forget to add this filter to a query (a huge security risk). Therefore, I would like to set a global filter based on the current user's admin rights to restrict what the logged-in user can see. Appreciate your help. Thanks.
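
    The wiki recipes (query subclassing, session events) are the more thorough route; as a minimal illustration of the idea, one low-tech safeguard is to route every read through a single helper so the user_id filter cannot be forgotten. A sketch, with made-up names and assuming a standard SQLAlchemy ORM session:

        def user_scoped_query(session, model, current_user_id):
            """Return a query on `model` that is always filtered to the current user."""
            return session.query(model).filter(model.user_id == current_user_id)

        # usage (Invoice is a mapped class that has a user_id column):
        # rows = user_scoped_query(session, Invoice, current_user.id) \
        #            .filter(Invoice.total > 0).all()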

  • nodejs, mongodb - How do I operate on data from multiple queries?

    - by Ryan
    Hi all, I'm new to JS in general, but I am trying to query some data from MongoDB. Basically, my first query retrieves information for the session with the specified session id. The second query does a simple geospatial query for documents with a location near the specified location. I'm using the mongodb-native JavaScript driver. All of these query methods return their results in callbacks, so they're non-blocking. This is the root of my troubles. What I need to do is retrieve the results of the second query and create an array of the sessionIds of all the returned documents. Then I'm going to pass those to a function later. But I can't generate this array and use it anywhere outside the callback. Does anyone have any idea how to properly do this? http://dpaste.org/wXiE/
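
    A hedged sketch of the usual shape of the answer (collection and field names below are invented, not taken from the paste): with callback-style APIs the result only exists inside the callback, so build the array there and hand it to the next step of the pipeline instead of trying to read it from outside.

        // second query: geospatial lookup, then pass the ids along
        db.collection('locations', function (err, collection) {
          if (err) return console.error(err);
          collection.find({ loc: { $near: [lng, lat] } }).toArray(function (err, docs) {
            if (err) return console.error(err);
            var sessionIds = docs.map(function (d) { return d.sessionId; });
            processSessionIds(sessionIds); // continue here, not outside the callback
          });
        });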

  • Are parametrized calls/sanitization/escaping characters necessary for hashed password fields in SQL queries?

    - by Computerish
    When writing a login system for a website, it is standard to use some combination of parameterized calls, sanitizing the user input, and/or escaping special characters to prevent SQL injection attacks. Any good login system, however, should also hash (and possibly salt) every password before it goes into an SQL query, so is it still necessary to worry about SQL injection attacks in passwords? Doesn't a hash completely eliminate any possibility of an SQL injection attack on its own?

  • How do I make use of multiple cores in Large SQL Server Queries?

    - by Jonathan Beerhalter
    I have two SQL Servers, one for production and one as an archive. Every night we've got a SQL job that runs and copies the day's production data over to the archive. As we've grown, this process takes longer and longer. When I watch the utilization on the archive server while the archival process runs, I see that it only ever makes use of a single core, and since this box has eight cores, that is a huge waste of resources. The job runs at 3 AM, so it's free to take any and all resources it can find. So what I need to do is figure out how to structure SQL Server jobs so they can take advantage of multiple cores, but I can't find any literature on tackling this problem. We're running SQL Server 2005, but I could certainly push for an upgrade if 2008 takes care of this problem.
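
    As a hedged illustration of where to look first (object names below are placeholders): whether a statement goes parallel is decided by the optimizer per query, governed by the instance-level "max degree of parallelism" and "cost threshold for parallelism" settings; a MAXDOP hint can only cap parallelism, not force it.

        -- Instance-level settings that gate parallel plans
        EXEC sp_configure 'show advanced options', 1;
        RECONFIGURE;
        EXEC sp_configure 'max degree of parallelism';      -- 0 means "use all available cores"
        EXEC sp_configure 'cost threshold for parallelism'; -- cheaper plans stay single-threaded

        -- Per-statement cap (it cannot force a parallel plan):
        -- INSERT INTO Archive.dbo.Orders
        -- SELECT * FROM Production.dbo.Orders WHERE OrderDate >= @since
        -- OPTION (MAXDOP 8);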

  • Are these 2 sql queries equivalent in all respects (e.g. estimated and actual execution plan)?

    - by Xerion
    Are queries 1) and 2) equivalent in terms of the estimated query plan AND the actual plan? (Can statistics ever affect the actual plan here?)

        declare @cat int -- input param from proc
        ...
        -- 1)
        select *
        from A as a
        join B as b on b.id = a.id and b.cat = @cat
        join C as c on c.fid = b.fid and c.cat = @cat
        where a.cat = @cat

        -- 2)
        select *
        from A as a
        join B as b on b.id = a.id and b.cat = a.cat
        join C as c on c.fid = b.fid and c.cat = b.cat
        where a.cat = @cat

    It seems to me that these should logically be equivalent, and the execution plan should always be the same regardless of the actual difference in tables. And adding more conditions, either in the join or in the where clause, or adding more tables to join, shouldn't change this. Are there cases where this is not true?

  • DB fields not showing up in association custom queries?

    - by Kevin
    I have a notification list for which the user can select show options for different kinds of notifications, along with how many results are returned. I'm keeping these settings in the user model because I want the custom sort to stay with the user between sessions. Here is the association in my user model:

        has_many :notifications,
                 :class_name  => "Notification",
                 :foreign_key => "user_id",
                 :conditions  => ["won = ? and lost = ? and paid = ?",
                                  self.prefs_won, self.prefs_lost, self.prefs_paid],
                 :limit       => self.prefs_results.to_s

    But when I use the above code, Rails throws an "unknown method" error for self.prefs_won. It is definitely a field in my database and set as a boolean value, but Rails can't find it... what's the problem?
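
    A hedged sketch of the likely cause and one common workaround (not from the post): at the point the has_many line runs, self is the User class itself, not a loaded user row, so instance attributes such as prefs_won don't exist yet. Applying the per-user preferences at call time sidesteps that, for example:

        class User < ActiveRecord::Base
          has_many :notifications

          # Evaluated per instance, so prefs_won/prefs_lost/prefs_paid are available.
          def filtered_notifications
            notifications.find(:all,
              :conditions => ["won = ? AND lost = ? AND paid = ?", prefs_won, prefs_lost, prefs_paid],
              :limit      => prefs_results)
          end
        end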

  • How to make UPDATE queries in LINQ to SQL?

    - by Alex
    I like using LINQ to SQL. The only problem is that I don't like the default way of updating tables. Let's say I have the following table with the following columns: ID (primary key), value1, value2, value3, value4, value5. When I need to update something, I call

        UPDATE ... WHERE ID=@id

    LINQ to SQL instead generates

        UPDATE ... WHERE ID=@id and value1=@value1 and value2=@value2 and value3=@value3 and value4=@value4 and value5=@value5

    I can override this behavior by adding UpdateCheck=UpdateCheck.Never to every column, but every time the DataContext class is regenerated with the GUI designer, this gets erased. Is there any way to tell LINQ to SQL to use this way of updating data?

  • PHP function to handle most database queries has a problem with results: I am getting the right number of rows

    - by asdasds
    Here is my little function. It does not handle the results correctly: I do get all the rows that I want, but all the rows of the $results array contain the exact same values. So I make two arrays: a temporary array to hold the values after each fetch, and another array to hold all the temporary arrays. First I map the temp array's keys to the column names, then I give it to bind_result, call fetch(), and use it like I would any other result value. Could this be because I re-use the $results array? $numresults is the number of values you are taking from each row; if it is 0, you are not getting any results back.

        function db_query($db, $query, $params = NULL, $numresults = 0)
        {
            if ($stmt = $db->prepare($query)) {
                if ($params != NULL) {
                    call_user_func_array(array($stmt, 'bind_param'), $params);
                }
                if (!$stmt->execute()) {
                    //echo 'exec error: ', $db->error;
                    return false;
                }
                if ($numresults > 0) {
                    $results = array();
                    $tmpresult = array();
                    $meta = $stmt->result_metadata();
                    while ($columnName = $meta->fetch_field())
                        $tmpresult[] = &$results[$columnName->name];
                    call_user_func_array(array($stmt, 'bind_result'), $tmpresult);
                    $meta->close();
                    $results = array();
                    while ($stmt->fetch())
                        $results[] = $tmpresult;
                }
                $stmt->close();
            } else {
                //echo 'prepare error: ', $db->error;
                return false;
            }
            if ($numresults == 0)
                return true;
            return $results;
        }
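
    A hedged guess at the culprit (a common pitfall with this bind_result-by-reference pattern, not a confirmed diagnosis): $tmpresult holds references, so appending it directly appends those same references for every row, and every row ends up showing the values of the last fetch. Copying the values out per row avoids that:

        // inside the fetch loop, append a value copy instead of the reference array
        while ($stmt->fetch()) {
            $row = array();
            foreach ($tmpresult as $key => $value) {
                $row[$key] = $value;   // plain assignment copies the value, dropping the reference
            }
            $results[] = $row;
        }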

  • Does it make sense to replace sub-queries by join?

    - by Roman
    For example, I have a query like this:

        select col1
        from t1
        where col2 > 0
          and col1 in (select col1 from t2 where col2 > 0)

    As far as I understand, I can replace it with the following query:

        select t1.col1
        from t1
        join (select col1 from t2 where col2 > 0) as t2
          on t1.col1 = t2.col1
        where t1.col2 > 0

    ADDED: In some answers I see join, in others inner join. Are both right, or are they even identical?
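
    One hedged caveat worth illustrating (not from the question itself): the two forms are not always interchangeable. If t2 contains several rows with the same col1 value, the JOIN version returns the matching t1 row once per duplicate, while the IN version returns it once. An EXISTS form keeps the IN semantics while still reading like a join condition:

        select t1.col1
        from t1
        where t1.col2 > 0
          and exists (select 1 from t2 where t2.col1 = t1.col1 and t2.col2 > 0);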

  • Parallel MySQL queries for HTML table - WHILE(x or y)?

    - by Beti Chode
    I'm trying to create a table using PHP. What I need is a table with two columns. I have an SQL table with 4 fields: a primary key id, language, word, and definition. The language for each row is either Arabic or Russian. I want a table that does the following:

        | definition          |
        |_____________________|
        |          |          |
        | rus1     | arab1    |
        | rus2     | arab2    |
        | rus3     | arab3    |
        | rus4     |          |

    So it divides the list by English word, creates a header row for each English word, then lists the Russian equivalents in the left column and the Arabic ones in the right. However, there are often not the same number of words on both sides. What I am doing right now is running a WHILE loop inside a WHILE loop. The outer loop is running fine, but I think I am doing the inner loop wrong. Here is the bulk of the code:

        $definitions = mysql_query("SELECT DISTINCT definition FROM words");
        WHILE ($row = mysql_fetch_array($definitions))
        {
            ECHO '<tr><th colspan="2">' . $row['definition'] . '</th></tr>';
            $russian = "SELECT * FROM words WHERE language='Russian' AND definition='" . $row['definition'] . "'";
            $arabic  = "SELECT * FROM words WHERE language='Arabic' AND definition='" . $row['definition'] . "'";
            WHILE ($rus = mysql_fetch_array($russian) or $arb = mysql_fetch_array($arabic))
            {
                ECHO '<tr><td>' . $rus['word'] . '</td><td>' . $arb['word'] . '</td></tr>';
            }
        }

    Sadly I am getting something like this:

        | definition          |
        |_____________________|
        |          |          |
        | rus1     |          |
        | rus2     |          |
        | rus3     |          |
        | rus4     |          |
        |          | arab1    |
        |          | arab2    |
        |          | arab3    |

    Not sure what other way I can do this. I tried changing the or to ||, thinking the different precedence would cause another outcome, but then I get ONLY the Russian column. I'm out of ideas, you guys are my only hope!
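
    A hedged sketch of one way to get the side-by-side layout (assuming the two per-language queries are actually executed with mysql_query, and with $def standing in for the current definition from the outer loop): fetch each language into its own array first, then emit rows up to the longer of the two counts, padding the shorter side with an empty cell.

        $rus = array();
        $res = mysql_query("SELECT word FROM words WHERE language='Russian' AND definition='" . mysql_real_escape_string($def) . "'");
        while ($r = mysql_fetch_assoc($res)) { $rus[] = $r['word']; }

        $arb = array();
        $res = mysql_query("SELECT word FROM words WHERE language='Arabic' AND definition='" . mysql_real_escape_string($def) . "'");
        while ($r = mysql_fetch_assoc($res)) { $arb[] = $r['word']; }

        for ($i = 0, $n = max(count($rus), count($arb)); $i < $n; $i++) {
            echo '<tr><td>' . (isset($rus[$i]) ? $rus[$i] : '') . '</td>'
               . '<td>' . (isset($arb[$i]) ? $arb[$i] : '') . '</td></tr>';
        }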

  • Force creation of query execution plan

    - by Marc
    I have the following situation:

    - a .NET 3.5 WinForms client app accessing SQL Server 2008
    - some queries returning a relatively large amount of data are used quite often by a form
    - some users are on a local SQL Express instance and restart their machines at least daily
    - other users are working remotely over slow network connections

    The problem is that after a restart, the first time users open this form the queries are extremely slow and take more or less 15 s to execute even on a fast machine. Afterwards the same queries take only 3 s. Of course this comes from the fact that no data is cached and it must be loaded from disk first. My question: would it be possible to force the loading of the required data into the SQL Server cache in advance?

    Note: My first idea was to execute the queries in a background worker when the application starts, so that when the user opens the form the queries will already be cached and execute fast. However, I don't want to pull the results of the queries over to the client, as some users are working remotely or otherwise have slow networks. So I thought of just executing the queries from a stored procedure and putting the results into temporary tables so that nothing would be returned. It turned out that some of the result sets use dynamic columns, so I couldn't create the corresponding temp tables, and thus this isn't a solution. Do you happen to have any other idea?

  • How will Arel affect Rails' includes() capabilities?

    - by Tim Snowhite
    I've looked over the Arel sources, and some of the ActiveRecord sources for Rails 3.0, but I can't seem to glean a good answer for myself as to whether Arel will be changing our ability to use includes(), when constructing queries, for the better. There are instances when one might want to modify the conditions on an ActiveRecord :include query in 2.3.5 and before, for the association records which would be returned. But as far as I know, this is not programmatically tenable for all :include queries. (I know some AR find-includes make t#{n}.c#{m} renames for all the attributes, and one could conceivably add conditions to these queries to limit the joined sets' results; but others do n_joins + 1 queries over the id sets iteratively, and I'm not sure how one might hack AR to edit those iterated queries.) Will Arel allow us to construct ActiveRecord queries which specify the resulting associated model objects when using includes()? Ex:

        # User has_many :posts (and Post has_many :comments)
        User.all(:include => :posts)
        # Say I wanted the post objects to have their comment counts loaded
        # without adding a comment_count column to `posts`.

        # At the post level, one could do so by:
        posts_with_counts = Post.all(
          :select   => 'posts.*, count(comments.id) as comment_count',
          :joins    => 'left outer join comments on comments.post_id = posts.id',
          :group_by => 'posts.id')  # i believe

        # But it seems impossible to do so while linking these post objects to each
        # user as well, without running User.all() and then zippering the objects
        # into some other collection (ugly),
        # OR running posts.group_by(&:user) (even uglier, with the n user queries).
