Search Results

Search found 33316 results on 1333 pages for 'sql team'.


  • Common Table Expressions slow when using a table variable

    - by Phil Haselden
    I have been experimenting with the following (simplified) CTE. When using a table variable (see below) the query runs for minutes before I cancel it. Any of the other commented-out methods return in less than a second. If I replace the whole WHERE clause with an INNER JOIN it is fast as well. Any ideas why using a table variable would run so slowly? FWIW: the database contains 2.5 million records and the inner query returns 2 records.

        CREATE TABLE #rootTempTable (RootID int PRIMARY KEY)
        INSERT INTO #rootTempTable VALUES (1360);

        DECLARE @rootTableVar TABLE (RootID int PRIMARY KEY);
        INSERT INTO @rootTableVar VALUES (1360);

        WITH My_CTE AS
        (
            SELECT ROW_NUMBER() OVER(ORDER BY d.DocumentID) rownum, d.DocumentID, d.Title
            FROM [Document] d
            WHERE d.LocationID IN
            (
                SELECT LocationID FROM Location
                JOIN @rootTableVar rtv ON Location.RootID = rtv.RootID            -- VERY SLOW!
                --JOIN #rootTempTable tt ON Location.RootID = tt.RootID           -- Fast
                --JOIN (SELECT 1360 as RootID) AS rt ON Location.RootID = rt.RootID -- Fast
                --WHERE RootID = 1360                                             -- Fast
            )
        )
        SELECT * FROM My_CTE
        WHERE (rownum > 0) AND (rownum <= 100)
        ORDER BY rownum
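
    One likely explanation, offered as a hedged guess since the question does not include the query plan: SQL Server keeps no statistics on table variables, so the optimizer typically plans as if @rootTableVar held a single row and can pick a poor strategy for the surrounding IN, whereas the #rootTempTable temp table does get statistics. A minimal sketch of a common mitigation, assuming SQL Server 2005 or later:

        WITH My_CTE AS
        (
            SELECT ROW_NUMBER() OVER(ORDER BY d.DocumentID) rownum, d.DocumentID, d.Title
            FROM [Document] d
            WHERE d.LocationID IN
            (
                SELECT LocationID FROM Location
                JOIN @rootTableVar rtv ON Location.RootID = rtv.RootID
            )
        )
        SELECT * FROM My_CTE
        WHERE (rownum > 0) AND (rownum <= 100)
        ORDER BY rownum
        OPTION (RECOMPILE); -- re-plan at execution time, when the table variable's actual row count is visible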

    Read the article

  • Sharepoint/WSS Reporting Services Integration woes

    - by mhollers
    After a number of failed attempts I seem to have successfully installed the Reporting Services add-in to my WSS farm. However, I seem to be missing most of the enhanced functionality, e.g. no report library template and no Report Center site template; the only additional functionality available is the Report Viewer web part.

    Background: a 2-server WSS 3.0 farm, with CA (Central Administration), the WFE (web front end) and the Reporting Services add-in installed on one server, and SQL Server 2005 SP2 with Reporting Services (RS) and all the databases installed on the other. I have a VM environment set up and have rolled this back and repeated the install a number of times. I have configured RS within CA and activated the 'Report Server Integration Feature'. Within 'Site Settings' I have a 'Reporting Services' heading with a 'Manage shared schedules' item/link; I'm not sure if there should be other options. I was under the impression that to view reports within SharePoint I could either create a new site using the 'Report Center' template or add a report library to an existing site, but neither seems available.

    I am at a loss as to what to do, as all the online information seems to deal with installation issues/errors, which I seem to have eventually got past.

    Read the article

  • How can I extend DynamicQuery.cs to implement a .Single method?

    - by Yoenhofen
    I need to write some dynamic queries for a project I'm working on. I'm finding out that a significant amount of time is being spent by my program on the Count and First methods, so I started to change to .Single, only to find out that there is no such method. The code below was my first attempt at creating one (mostly copied from the Where method), but it's not working. Help?

        public static object Single(this IQueryable source, string predicate, params object[] values)
        {
            if (source == null) throw new ArgumentNullException("source");
            if (predicate == null) throw new ArgumentNullException("predicate");
            LambdaExpression lambda = DynamicExpression.ParseLambda(source.ElementType, typeof(bool), predicate, values);
            return source.Provider.CreateQuery(
                Expression.Call(
                    typeof(Queryable), "Single",
                    new Type[] { source.ElementType },
                    source.Expression, Expression.Quote(lambda)));
        }

    Read the article

  • Dealing with huge SQL resultset

    - by Dave McClelland
    I am working with a rather large MySQL database (several million rows) with a column storing blob images. The application attempts to grab a subset of the images and run some processing algorithms on them. The problem I'm running into is that, due to the rather large dataset, the resultset my query returns is too large to store in memory. For the time being, I have changed the query to not return the images; while iterating over the resultset, I run another select which grabs the individual image that relates to the current record. This works, but the tens of thousands of extra queries have resulted in an unacceptable performance decrease. My next idea is to limit the original query to 10,000 results or so, and then keep querying over spans of 10,000 rows. This seems like a middle-of-the-road compromise between the two approaches. I feel that there is probably a better solution that I am not aware of. Is there another way to have only portions of a gigantic resultset in memory at a time? Cheers, Dave McClelland
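
    One pattern worth considering alongside fixed 10,000-row windows is keyset (seek) pagination: instead of LIMIT ... OFFSET, remember the last primary-key value seen and ask for the rows after it, so each chunk is an index seek rather than a scan over everything already skipped. A sketch with invented table and column names (images, id, image_blob), assuming an indexed integer primary key:

        -- First chunk (start below the smallest id)
        SELECT id, image_blob
        FROM images
        WHERE id > 0
        ORDER BY id
        LIMIT 10000;

        -- Next chunk: plug in the largest id returned by the previous chunk
        SELECT id, image_blob
        FROM images
        WHERE id > 10000   -- placeholder: the last id actually seen
        ORDER BY id
        LIMIT 10000;

    This keeps only one chunk of blobs in memory at a time, and the cost per chunk stays flat as you walk the table.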

    Read the article

  • ODBC Connection String Problem

    - by Brett
    Hi there, I am having major trouble connecting to my database via ODBC. The DB is local (but I have a mirror on a virtual machine), so I am trying to use the connection string:

        Dsn=MonetDB;host=TARBELL

    where TARBELL is the name of my computer. However, it doesn't connect. BUT, this string does:

        Dsn=MonetDB;host=localhost

    as does:

        Dsn=MonetDB

    Can anyone explain this? I am at a complete loss. I have taken down my firewalls (at least until I get this figured out), so that can't be the problem. I eventually want to change TARBELL to the mirrored virtual machine running another instance of the database. Many thanks, Brett

    Read the article

  • How to get the millisecond value from a Timestamp field in firebird with Delphi 2007

    - by Re0sless
    I have a Firebird database (running on server version 2.1.3) and am connecting to it with Delphi 2007 using the dbExpress objects (using the InterBase driver). One of my tables in the database looks something like this:

        CREATE TABLE MYTABLE
        (
            MYDATE Timestamp NOT NULL,
            MYINDEX Integer NOT NULL,
            ... snip ...
            PRIMARY KEY (MYDATE, MYINDEX)
        );

    I can add to the table OK, and in FlameRobin it shows the timestamp field as having a millisecond value. But when I do a select all (select * from MYTABLE) on the table, I cannot get the millisecond value: it is always returned as 000. This causes major problems, as it is part of the primary key (unfortunately I didn't design the table and don't have the authority to change it). I have tried the following to get the millisecond value:

        sql1.FieldByName('MYDATE').AsDateTime;
        sql1.FieldByName('MYDATE').AsSQLTimeStamp;
        sql1.FieldByName('MYDATE').AsString;
        sql1.FieldByName('MYDATE').AsFloat;

    But they all return 14/09/2009 14:25:06.000 when formatted. How do I retrieve the millisecond from a timestamp?

    UPDATE: In case this helps anyone in the future, here are the drivers I tried for dbExpress and the results:

      - Embarcadero - dbExpress Driver for Firebird (Delphi 2010 trial version): milliseconds not supported in timestamps.
      - Chau Chee Yang's - dbExpress Driver for Firebird (Delphi 2007): milliseconds not supported in timestamps.
      - UpScene - InterXpress for Firebird (Delphi 2007): milliseconds are supported in timestamps.
      - DevArt - dbExpress Driver for InterBase (Delphi 2007): milliseconds are supported in timestamps.
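
    One driver-independent workaround, sketched here on the assumption that it is the dbExpress layer truncating the fraction (the server clearly stores it, since FlameRobin shows it): have Firebird return the millisecond part as a separate numeric column, so nothing depends on the driver's timestamp precision. Firebird 2.1 added MILLISECOND to EXTRACT:

        SELECT MYDATE,
               EXTRACT(MILLISECOND FROM MYDATE) AS MYDATE_MS, -- fractional part as a plain number
               MYINDEX
        FROM MYTABLE;

    Reading MYDATE_MS as an ordinary numeric field then recovers the component the timestamp field loses.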

    Read the article

  • RDLC: How to print multiple tables in one report

    - by Eduardo Molteni
    I'm generating the RDLC XML schema and showing the report in the ReportViewer control. No problems there. Now I want a report with two tables, with two different datasets. Something like this gets generated:

        <Body>
            <ReportItems>
                <Table Name="Table1">
                    ....
                </Table>
                <Table Name="Table2">
                    ....
                </Table>
            </ReportItems>
        </Body>

    But when printed, both tables start from the top, printing one table over the other (not nice). Is there a way to tell that Table2 should start after Table1?

    Update: I've tried with List with a fake DataSource, but it does not work.

    Read the article

  • SSIS Send Mail task limited to 255 characters in address?

    - by CodeByMoonlight
    Is this a bug, or some hidden limit I can't find any documentation about? When creating a Send Mail Task in SSIS 2008, the TO, CC and BCC fields seem to have a hidden limit of 255 characters. I'm aware this is the standard limit for individual email addresses, but all three are commonly used for multiple addresses and the comment for the To field even says "separate the recipients with semicolons". But nevertheless, it truncates the address to a maximum of 255 characters. Bug, non-obvious standard, or something I'm missing? Any way around this? We were trying to build a CC list dynamically, but this has caused a rethink.

    Read the article

  • Database design for summarized data

    - by holden
    I have a new table I'm going to add to a bunch of other summarized data, basically to take some of the load off by calculating weekly averages. My question is whether I would be better off with one model over the other: a model with a day-of-week column and an additional column for the price, or a model with a series of fields for the days of the week, each holding that day's price. Which would save me in speed and/or headaches, or at least what is the trade-off? I.e. either

        ID  OBJECT_ID  MON  TUE  WED  THU  FRI  SAT  SUN  SOURCE

    or

        ID  OBJECT_ID  DAYOFWEEK  PRICE  SOURCE
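
    For what the narrow model looks like in practice, here is a sketch (invented names, MySQL-flavoured DDL): one row per object per weekday keeps the schema stable if the per-day attributes ever grow, and the wide Mon-Sun layout can still be recovered with conditional aggregation:

        CREATE TABLE object_day_price (
            id          INT AUTO_INCREMENT PRIMARY KEY,
            object_id   INT NOT NULL,
            day_of_week TINYINT NOT NULL,      -- 1 = Mon ... 7 = Sun
            price       DECIMAL(10,2) NOT NULL,
            source      VARCHAR(50),
            UNIQUE KEY (object_id, day_of_week, source)
        );

        -- Rebuild the wide, one-row-per-object layout when a report needs it
        SELECT object_id,
               MAX(CASE WHEN day_of_week = 1 THEN price END) AS mon,
               MAX(CASE WHEN day_of_week = 2 THEN price END) AS tue,
               MAX(CASE WHEN day_of_week = 3 THEN price END) AS wed,
               MAX(CASE WHEN day_of_week = 4 THEN price END) AS thu,
               MAX(CASE WHEN day_of_week = 5 THEN price END) AS fri,
               MAX(CASE WHEN day_of_week = 6 THEN price END) AS sat,
               MAX(CASE WHEN day_of_week = 7 THEN price END) AS sun
        FROM object_day_price
        GROUP BY object_id;

    The trade-off, roughly: the wide table reads a whole week in one row with no pivoting, while the narrow one makes per-day queries, averages, and constraints simpler.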

    Read the article

  • ibatis isNotEmpty with multiple variables

    - by hibernate
    Suppose I have a massive table called inactiveUsers and a search form. I want to conditionally join the inactiveUsers table if any user-related characteristic is chosen (address, name, phoneNumber, etc...). Is there any way to do this without the following:

        <isNotEmpty property="address">JOIN inactiveUsers</isNotEmpty>
        <isNotEmpty property="phoneNumber">JOIN inactiveUsers</isNotEmpty>
        <isNotEmpty property="name">JOIN inactiveUsers</isNotEmpty>

    and so on for another 10-20 isNotEmpty clauses? I would like to do something like:

        <isAnyNotEmpty properties="address, phoneNumber, name, ....">JOIN inactiveUsers</isAnyNotEmpty>

    Is this possible with iBATIS? If so, how?

    Read the article

  • Silverlight: After adding table no datacontext classes are available for the new DomainService class

    - by Gabriel
    I'm using a Silverlight 4 WCF RIA Services demo application that uses LINQ to SQL, and that part works well.

    1. I add a new database table, move the new table onto the LINQ to SQL designer and build the project.
    2. I add a new DomainService class.
    3. I get a dialog whose only option is to create an empty DomainService; no DataContext classes are available.

    What am I missing? Thanks in advance, Gabriel

    Read the article

  • Check username and password in LINQ query

    - by b0x0rz
    This LINQ query:

        var users = from u in context.Users
                    where u.UserEMailAdresses.Any(e1 => e1.EMailAddress == userEMail)
                       && u.UserPasswords.Any(e2 => e2.PasswordSaltedHash == passwordSaltedHash)
                    select u;
        return users.Count();

    returns 1 even when there is nothing in the password table. How come? What I am trying to do is get the values of the email and password hash from two separate tables (UserEMailAddresses and UserPasswords) linked via foreign keys to the third table (Users). It should be simple: checking whether the email and password from the form match the database. But it is not working for me; I get 1 for the count even when there are NO entries in the UserPasswords table. Is the LINQ query above completely wrong, or...?
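
    One way to narrow this down: the intended check, written as plain SQL, is a pair of joins that must both match for the same user. A sketch, assuming hypothetical foreign-key column names (UserID):

        SELECT COUNT(*)
        FROM Users u
        JOIN UserEMailAdresses e
          ON e.UserID = u.UserID AND e.EMailAddress = @userEMail
        JOIN UserPasswords p
          ON p.UserID = u.UserID AND p.PasswordSaltedHash = @passwordSaltedHash;

    Running that directly against the database, and comparing it with the SQL the provider actually emits (if context is a LINQ to SQL DataContext, assigning a TextWriter to context.Log prints the generated statements), should show whether the surprise count of 1 comes from the query translation or from the data/connection being different from what is expected.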

    Read the article

  • C#: Fill DataGridView From Anonymous Linq Query

    - by mdvaldosta
        // From my form
        BindingSource bs = new BindingSource();

        private void fillStudentGrid()
        {
            bs.DataSource = Admin.GetStudents();
            dgViewStudents.DataSource = bs;
        }

        // From the Admin class
        public static List<Student> GetStudents()
        {
            DojoDBDataContext conn = new DojoDBDataContext();
            var query = (from s in conn.Students
                         select new Student
                         {
                             ID = s.ID,
                             FirstName = s.FirstName,
                             LastName = s.LastName,
                             Belt = s.Belt
                         }).ToList();
            return query;
        }

    I'm trying to fill a DataGridView control in WinForms, and I only want a few of the values. The code compiles, but throws a runtime error:

        Explicit construction of entity type 'DojoManagement.Student' in query is not allowed.

    Is there a way to get it working in this manner?

    Read the article

  • Error using Dynamic Data Filtering: missing datasource

    - by sebastiaan
    I am trying to use the ASP.NET Dynamic Data Filtering project, but I'm running into a problem during the configuration. I'm following the instructions on the author's blog, and everything works as described. Then it tells me to change the data source using the designer view: I am told to select the "GridDataSource" in the "Configure data source" wizard. That option is not there, though. I get all of the classes in my project, including the DataContext that was generated by LINQ. When I choose "Show only DataContext objects", the dropdown ("Choose your context object:") is completely empty. When I turn off the checkbox and choose my DataContext class, I get asked which table I want and all that. But, as the whole purpose of a Dynamic Data site is NOT to use one single table, that's not much help. So I've looked at the instructions again and copied the resulting data source from the example:

        <asp:DynamicLinqDataSource ID="GridDataSource" runat="server"
            EnableDelete="True" EnableUpdate="True"></asp:DynamicLinqDataSource>

    which is exactly what I had, without the "WhereParameters" nodes in there. Now, when I run the list page, I get an exception about a missing data source from the filtering component. Of course, when I remove the DynamicFilterRepeater, it works again. This is the meat of the exception:

        [InvalidOperationException: Missing DataSource]
        Catalyst.Web.DynamicData.DynamicFilterRepeater.GetTable() in D:\Catalyst\Projects\DynamicData\Project\Trunk\DynamicData\DynamicData\DynamicFilterRepeater.cs:74
        Catalyst.Web.DynamicData.DynamicFilterRepeater.GetFilters() in D:\Catalyst\Projects\DynamicData\Project\Trunk\DynamicData\DynamicData\DynamicFilterRepeater.cs:81
        Catalyst.Web.DynamicData.DynamicFilterRepeater.OnInit(EventArgs e) in D:\Catalyst\Projects\DynamicData\Project\Trunk\DynamicData\DynamicData\DynamicFilterRepeater.cs:106

    How do I make the DynamicFilterRepeater recognize my data source? I'm using VS2010 Pro on a Win7 machine.

    Read the article

  • Wordpress SQL Select Multiple Meta Values / Meta Keys / Custom Fields

    - by Wes
    I am trying to modify a WordPress / MySQL function to display a little more information. I'm currently running the following query, which selects the post, joins the postmeta, and gets the info where meta_key = '_liked':

        function most_liked_posts($numberOf, $before, $after, $show_count) {
            global $wpdb;
            $request = "SELECT ID, post_title, meta_value FROM $wpdb->posts, $wpdb->postmeta";
            $request .= " WHERE $wpdb->posts.ID = $wpdb->postmeta.post_id";
            $request .= " AND post_status='publish' AND post_type='post' AND meta_key='_liked' ";
            $request .= " ORDER BY $wpdb->postmeta.meta_value+0 DESC LIMIT $numberOf";
            $posts = $wpdb->get_results($request);
            foreach ($posts as $post) {
                $post_title = stripslashes($post->post_title);
                $permalink = get_permalink($post->ID);
                $post_count = $post->meta_value;
                echo $before.'<a href="' . $permalink . '" title="' . $post_title.'" rel="nofollow">' . $post_title . '</a>';
                echo $show_count == '1' ? ' ('.$post_count.')' : '';
                echo $after;
            }
        }

    The important part being:

        $post_count = $post->meta_value;

    But now I also want to grab a value that is attached to each post called wbphoto. How do I specify that $post_count = _liked and $photo = wbphoto?
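
    For reference, one common shape for pulling two meta keys in a single statement is a second join against the postmeta table, one alias per key. A sketch using literal table names in place of the $wpdb variables (assumes the default wp_ prefix; wbphoto is joined with LEFT JOIN so posts without a photo still appear):

        SELECT p.ID, p.post_title,
               likes.meta_value AS like_count,
               photo.meta_value AS wbphoto
        FROM wp_posts p
        JOIN wp_postmeta likes
          ON likes.post_id = p.ID AND likes.meta_key = '_liked'
        LEFT JOIN wp_postmeta photo
          ON photo.post_id = p.ID AND photo.meta_key = 'wbphoto'
        WHERE p.post_status = 'publish' AND p.post_type = 'post'
        ORDER BY likes.meta_value + 0 DESC
        LIMIT 10;

    In the function above, the same thing would be built by aliasing $wpdb->postmeta twice in $request and reading the extra column with $photo = $post->wbphoto.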

    Read the article

  • Binding a DropDownList in ListView InsertItemTemplate throwing an error

    - by Telos
    I've got a ListView which binds to a LinqDataSource and displays selected locations. The insert item contains a DropDownList that pulls from another LinqDataSource to give all the unselected locations. The problem is that I get the following error when loading the page:

        Databinding methods such as Eval(), XPath(), and Bind() can only be used in the context of a databound control.

    I'm doing a very similar setup in another page of the website, and it isn't giving us this error, so I'm pretty confused. I know I can work around this by not binding, manually finding the control and getting the value, but this should work and I don't understand why it isn't. Any thoughts? The better part of the source code is below.

        <asp:LinqDataSource ID="ldsLocations" runat="server"
            ContextTypeName="ClearviewInterface.ESLinqDataContext"
            EnableDelete="true" EnableInsert="true"
            OnInserting="ldsLocations_Inserting" OnDeleting="ldsLocations_Deleting"
            TableName="crmLocations" OrderBy="addr1"
            OnSelecting="ldsLocations_Selecting" />

        <asp:LinqDataSource ID="ldsFreeLocations" runat="server"
            ContextTypeName="ClearviewInterface.ESLinqDataContext"
            OrderBy="addr1" TableName="v_CVLocations"
            OnSelecting="ldsFreeLocations_Selecting" />

        <asp:ListView ID="lvLocations" DataSourceID="ldsLocations" DataKeyNames="ID"
            InsertItemPosition="LastItem" runat="server">
            <InsertItemTemplate>
                <tr>
                    <td colspan="6"><hr /></td>
                </tr>
                <tr>
                    <td colspan="2">
                        <asp:DropDownList ID="ddlFreeLocations" DataSourceID="ldsFreeLocations"
                            DataTextField="addr1" DataValueField="record"
                            MarkFirstMatch="true" SelectedValue='<%# Bind("record") %>'
                            runat="server" />
                    </td>
                    <td><asp:ImageButton ID="btnAdd" CommandName="Insert" SkinID="Insert" runat="server" /></td>
                </tr>
            </InsertItemTemplate>
    Read the article

  • Multiple full joins in Postgres are slow

    - by blast83
    I have a program that uses the IMDB database and am getting very slow performance on my query. It appears that the WHERE condition is not applied until after everything has been materialized. I looked around for hints to use, but nothing seems to work. Here is my query:

        SELECT *
        FROM name as n1
        FULL JOIN aka_name ON n1.id = aka_name.person_id
        FULL JOIN cast_info as t2 ON n1.id = t2.person_id
        FULL JOIN person_info as t3 ON n1.id = t3.person_id
        FULL JOIN char_name as t4 ON t2.person_role_id = t4.id
        FULL JOIN role_type as t5 ON t2.role_id = t5.id
        FULL JOIN title as t6 ON t2.movie_id = t6.id
        FULL JOIN aka_title as t7 ON t6.id = t7.movie_id
        FULL JOIN complete_cast as t8 ON t6.id = t8.movie_id
        FULL JOIN kind_type as t9 ON t6.kind_id = t9.id
        FULL JOIN movie_companies as t10 ON t6.id = t10.movie_id
        FULL JOIN movie_info as t11 ON t6.id = t11.movie_id
        FULL JOIN movie_info_idx as t19 ON t6.id = t19.movie_id
        FULL JOIN movie_keyword as t12 ON t6.id = t12.movie_id
        FULL JOIN movie_link as t13 ON t6.id = t13.linked_movie_id
        FULL JOIN link_type as t14 ON t13.link_type_id = t14.id
        FULL JOIN keyword as t15 ON t12.keyword_id = t15.id
        FULL JOIN company_name as t16 ON t10.company_id = t16.id
        FULL JOIN company_type as t17 ON t10.company_type_id = t17.id
        FULL JOIN comp_cast_type as t18 ON t8.status_id = t18.id
        WHERE n1.id = 2003

    Every table is related to the others via foreign-key constraints, and all the mentioned columns have indexes. The query plan details:

        Hash Left Join (cost=5838187.01..13756845.07 rows=15579622 width=835) (actual time=146879.213..146891.861 rows=20 loops=1)
          Hash Cond: (t8.status_id = t18.id)
          ->  Hash Left Join (cost=5838185.92..13542624.18 rows=15579622 width=822) (actual time=146879.199..146891.833 rows=20 loops=1)
              Hash Cond: (t10.company_type_id = t17.id)
              ->  Hash Left Join (cost=5838184.83..13328403.29 rows=15579622 width=797) (actual time=146879.165..146891.781 rows=20 loops=1)
                  Hash Cond: (t10.company_id = t16.id)
                  ->  Hash Left Join (cost=5828372.95..10061752.03 rows=15579622 width=755) (actual time=146426.483..146429.756 rows=20 loops=1)
                      Hash Cond: (t12.keyword_id = t15.id)
                      ->  Hash Left Join (cost=5825164.23..6914088.45 rows=15579622 width=731) (actual time=146372.411..146372.529 rows=20 loops=1)
                          Hash Cond: (t13.link_type_id = t14.id)
                          ->  Merge Left Join (cost=5825162.82..6699867.24 rows=15579622 width=715) (actual time=146372.366..146372.472 rows=20 loops=1)
                              Merge Cond: (t6.id = t13.linked_movie_id)
                              ->  Merge Left Join (cost=5684009.29..6378956.77 rows=15579622 width=699) (actual time=144019.620..144019.711 rows=20 loops=1)
                                  Merge Cond: (t6.id = t12.movie_id)
                                  ->  Merge Left Join (cost=5182403.90..5622400.75 rows=8502523 width=687) (actual time=136849.731..136849.809 rows=20 loops=1)
                                      Merge Cond: (t6.id = t19.movie_id)
                                      ->  Merge Left Join (cost=4974472.00..5315778.48 rows=8502523 width=637) (actual time=134972.032..134972.099 rows=20 loops=1)
                                          Merge Cond: (t6.id = t11.movie_id)
                                          ->  Merge Left Join (cost=1830064.81..2033131.89 rows=1341632 width=561) (actual time=63784.035..63784.062 rows=2 loops=1)
                                              Merge Cond: (t6.id = t10.movie_id)
                                              ->  Nested Loop Left Join (cost=1417360.29..1594294.02 rows=1044480 width=521) (actual time=59279.246..59279.264 rows=1 loops=1)
                                                  Join Filter: (t6.kind_id = t9.id)
                                                  ->  Merge Left Join (cost=1417359.22..1429787.34 rows=1044480 width=507) (actual time=59279.222..59279.224 rows=1 loops=1)
                                                      Merge Cond: (t6.id = t8.movie_id)
                                                      ->  Merge Left Join (cost=1405731.84..1414378.65 rows=1044480 width=491) (actual time=59121.773..59121.775 rows=1 loops=1)
                                                          Merge Cond: (t6.id = t7.movie_id)
                                                          ->  Sort (cost=1346206.04..1348817.24 rows=1044480 width=416) (actual time=58095.230..58095.231 rows=1 loops=1)
                                                              Sort Key: t6.id
                                                              Sort Method: quicksort Memory: 17kB
                                                              ->  Hash Left Join (cost=172406.29..456387.53 rows=1044480 width=416) (actual time=57969.371..58095.208 rows=1 loops=1)
                                                                  Hash Cond: (t2.movie_id = t6.id)
                                                                  ->  Hash Left Join (cost=104700.38..256885.82 rows=1044480 width=358) (actual time=49981.493..50006.303 rows=1 loops=1)
                                                                      Hash Cond: (t2.role_id = t5.id)
                                                                      ->  Hash Left Join (cost=104699.11..242522.95 rows=1044480 width=343) (actual time=49981.441..50006.250 rows=1 loops=1)
                                                                          Hash Cond: (t2.person_role_id = t4.id)
                                                                          ->  Hash Left Join (cost=464.96..12283.95 rows=1044480 width=269) (actual time=0.071..0.087 rows=1 loops=1)
                                                                              Hash Cond: (n1.id = t3.person_id)
                                                                              ->  Nested Loop Left Join (cost=0.00..49.39 rows=7680 width=160) (actual time=0.051..0.066 rows=1 loops=1)
                                                                                  ->  Nested Loop Left Join (cost=0.00..17.04 rows=3 width=119) (actual time=0.038..0.041 rows=1 loops=1)
                                                                                      ->  Index Scan using name_pkey on name n1 (cost=0.00..8.68 rows=1 width=39) (actual time=0.022..0.024 rows=1 loops=1)
                                                                                          Index Cond: (id = 2003)
                                                                                      ->  Index Scan using aka_name_idx_person on aka_name (cost=0.00..8.34 rows=1 width=80) (actual time=0.010..0.010 rows=0 loops=1)
                                                                                          Index Cond: ((aka_name.person_id = 2003) AND (n1.id = aka_name.person_id))
                                                                                  ->  Index Scan using cast_info_idx_pid on cast_info t2 (cost=0.00..10.77 rows=1 width=41) (actual time=0.011..0.020 rows=1 loops=1)
                                                                                      Index Cond: ((t2.person_id = 2003) AND (n1.id = t2.person_id))
                                                                              ->  Hash (cost=463.26..463.26 rows=136 width=109) (actual time=0.010..0.010 rows=0 loops=1)
                                                                                  ->  Index Scan using person_info_idx_pid on person_info t3 (cost=0.00..463.26 rows=136 width=109) (actual time=0.009..0.009 rows=0 loops=1)
                                                                                      Index Cond: (person_id = 2003)
                                                                          ->  Hash (cost=42697.62..42697.62 rows=2442362 width=74) (actual time=49305.872..49305.872 rows=2442362 loops=1)
                                                                              ->  Seq Scan on char_name t4 (cost=0.00..42697.62 rows=2442362 width=74) (actual time=14.066..22775.087 rows=2442362 loops=1)
                                                                      ->  Hash (cost=1.12..1.12 rows=12 width=15) (actual time=0.024..0.024 rows=12 loops=1)
                                                                          ->  Seq Scan on role_type t5 (cost=0.00..1.12 rows=12 width=15) (actual time=0.012..0.014 rows=12 loops=1)
                                                                  ->  Hash (cost=31134.07..31134.07 rows=1573507 width=58) (actual time=7841.225..7841.225 rows=1573507 loops=1)
                                                                      ->  Seq Scan on title t6 (cost=0.00..31134.07 rows=1573507 width=58) (actual time=21.507..2799.443 rows=1573507 loops=1)
                                                          ->  Materialize (cost=59525.80..63203.88 rows=294246 width=75) (actual time=812.376..984.958 rows=192075 loops=1)
                                                              ->  Sort (cost=59525.80..60261.42 rows=294246 width=75) (actual time=812.363..922.452 rows=192075 loops=1)
                                                                  Sort Key: t7.movie_id
                                                                  Sort Method: external merge Disk: 24880kB
                                                                  ->  Seq Scan on aka_title t7 (cost=0.00..6646.46 rows=294246 width=75) (actual time=24.652..164.822 rows=294246 loops=1)
                                                      ->  Materialize (cost=11627.38..12884.43 rows=100564 width=16) (actual time=123.819..149.086 rows=41907 loops=1)
                                                          ->  Sort (cost=11627.38..11878.79 rows=100564 width=16) (actual time=123.807..138.530 rows=41907 loops=1)
                                                              Sort Key: t8.movie_id
                                                              Sort Method: external merge Disk: 3136kB
                                                              ->  Seq Scan on complete_cast t8 (cost=0.00..1549.64 rows=100564 width=16) (actual time=0.013..10.744 rows=100564 loops=1)
                                                  ->  Materialize (cost=1.08..1.15 rows=7 width=14) (actual time=0.016..0.029 rows=7 loops=1)
                                                      ->  Seq Scan on kind_type t9 (cost=0.00..1.07 rows=7 width=14) (actual time=0.011..0.013 rows=7 loops=1)
                                              ->  Materialize (cost=412704.52..437969.09 rows=2021166 width=40) (actual time=3420.356..4278.545 rows=1028995 loops=1)
                                                  ->  Sort (cost=412704.52..417757.43 rows=2021166 width=40) (actual time=3420.349..3953.483 rows=1028995 loops=1)
                                                      Sort Key: t10.movie_id
                                                      Sort Method: external merge Disk: 90960kB
                                                      ->  Seq Scan on movie_companies t10 (cost=0.00..35214.66 rows=2021166 width=40) (actual time=13.271..566.893 rows=2021166 loops=1)
                                          ->  Materialize (cost=3144407.19..3269057.42 rows=9972019 width=76) (actual time=65485.672..70083.219 rows=5039009 loops=1)
                                              ->  Sort (cost=3144407.19..3169337.23 rows=9972019 width=76) (actual time=65485.667..68385.550 rows=5038999 loops=1)
                                                  Sort Key: t11.movie_id
                                                  Sort Method: external merge Disk: 735512kB
                                                  ->  Seq Scan on movie_info t11 (cost=0.00..212815.19 rows=9972019 width=76) (actual time=15.750..15715.608 rows=9972019 loops=1)
                                      ->  Materialize (cost=207925.01..219867.92 rows=955433 width=50) (actual time=1483.989..1785.636 rows=429401 loops=1)
                                          ->  Sort (cost=207925.01..210313.59 rows=955433 width=50) (actual time=1483.983..1654.165 rows=429401 loops=1)
                                              Sort Key: t19.movie_id
                                              Sort Method: external merge Disk: 31720kB
                                              ->  Seq Scan on movie_info_idx t19 (cost=0.00..15047.33 rows=955433 width=50) (actual time=7.284..221.597 rows=955433 loops=1)
                                  ->  Materialize (cost=501605.39..537645.64 rows=2883220 width=12) (actual time=5823.040..6868.242 rows=1597396 loops=1)
                                      ->  Sort (cost=501605.39..508813.44 rows=2883220 width=12) (actual time=5823.026..6477.517 rows=1597396 loops=1)
                                          Sort Key: t12.movie_id
                                          Sort Method: external merge Disk: 78888kB
                                          ->  Seq Scan on movie_keyword t12 (cost=0.00..44417.20 rows=2883220 width=12) (actual time=11.672..839.498 rows=2883220 loops=1)
                              ->  Materialize (cost=141143.93..152995.81 rows=948150 width=16) (actual time=1916.356..2253.004 rows=478358 loops=1)
                                  ->  Sort (cost=141143.93..143514.31 rows=948150 width=16) (actual time=1916.344..2125.698 rows=478358 loops=1)
                                      Sort Key: t13.linked_movie_id
                                      Sort Method: external merge Disk: 29632kB
                                      ->  Seq Scan on movie_link t13 (cost=0.00..14607.50 rows=948150 width=16) (actual time=27.610..297.962 rows=948150 loops=1)
                          ->  Hash (cost=1.18..1.18 rows=18 width=16) (actual time=0.020..0.020 rows=18 loops=1)
                              ->  Seq Scan on link_type t14 (cost=0.00..1.18 rows=18 width=16) (actual time=0.010..0.012 rows=18 loops=1)
                      ->  Hash (cost=1537.10..1537.10 rows=91010 width=24) (actual time=54.055..54.055 rows=91010 loops=1)
                          ->  Seq Scan on keyword t15 (cost=0.00..1537.10 rows=91010 width=24) (actual time=0.006..14.703 rows=91010 loops=1)
                  ->  Hash (cost=4585.61..4585.61 rows=245461 width=42) (actual time=445.269..445.269 rows=245461 loops=1)
                      ->  Seq Scan on company_name t16 (cost=0.00..4585.61 rows=245461 width=42) (actual time=12.037..309.961 rows=245461 loops=1)
              ->  Hash (cost=1.04..1.04 rows=4 width=25) (actual time=0.013..0.013 rows=4 loops=1)
                  ->  Seq Scan on company_type t17 (cost=0.00..1.04 rows=4 width=25) (actual time=0.009..0.010 rows=4 loops=1)
          ->  Hash (cost=1.04..1.04 rows=4 width=13) (actual time=0.006..0.006 rows=4 loops=1)
              ->  Seq Scan on comp_cast_type t18 (cost=0.00..1.04 rows=4 width=13) (actual time=0.002..0.003 rows=4 loops=1)
        Total runtime: 147055.016 ms

    Is there any way to force the name.id = 2003 filter to be applied before it tries to join all the tables together? As you can see, the end result is 4 tuples, and it seems like this should be a fast join, however complex, by using the available indexes once the name clause has narrowed things down.
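
    A hedged observation rather than a tested fix: FULL joins must preserve unmatched rows from both sides, which prevents the planner from reordering them or pushing the WHERE n1.id = 2003 predicate down below the joins, so everything is materialized first and filtered last, which is what the plan shows. That predicate also discards every row in which n1 is NULL, i.e. every row a FULL JOIN would add beyond a LEFT JOIN from name, so the query appears equivalent to a chain of LEFT JOINs, which Postgres can reorder and drive from the single indexed name row. A sketch of the rewrite (first joins shown; the rest follow the same pattern):

        SELECT *
        FROM name AS n1
        LEFT JOIN aka_name           ON n1.id = aka_name.person_id
        LEFT JOIN cast_info   AS t2  ON n1.id = t2.person_id
        LEFT JOIN person_info AS t3  ON n1.id = t3.person_id
        LEFT JOIN char_name   AS t4  ON t2.person_role_id = t4.id
        -- ... change each remaining FULL JOIN (t5 through t19) to LEFT JOIN the same way ...
        WHERE n1.id = 2003;

    With 19 joins the query is also well past the default join_collapse_limit of 8, so raising that setting for the session may matter as well once the joins are reorderable.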

    Read the article

  • mysql query to dynamically convert row data to columns

    - by Anirudh Goel
    I am working on a pivot table query. The schema is as follows:

        Sno, Name, District

    The same name may appear in many districts, e.g. take the sample data for example:

        1   Mike     CA
        2   Mike     CA
        3   Proctor  JB
        4   Luke     MN
        5   Luke     MN
        6   Mike     CA
        7   Mike     LP
        8   Proctor  MN
        9   Proctor  JB
        10  Proctor  MN
        11  Luke     MN

    As you see, I have a set of 4 distinct districts (CA, JB, MN, LP). Now I want to get the pivot table generated for it by mapping the name against the districts:

        Name     CA  JB  MN  LP
        Mike     3   0   0   1
        Proctor  0   2   2   0
        Luke     0   0   3   0

    I wrote the following query for this:

        select name,
               sum(if(District="CA",1,0)) as "CA",
               sum(if(District="JB",1,0)) as "JB",
               sum(if(District="MN",1,0)) as "MN",
               sum(if(District="LP",1,0)) as "LP"
        from district_details
        group by name

    However, there is a possibility that the districts may increase, in which case I will have to manually edit the query again and add the new district to it. I want to know if there is a query which can dynamically take the names of the distinct districts and run the above query. I know I can do it with a procedure, generating the script on the fly; is there any other method? I ask because the output of the query "select distinct(districts) from district_details" will return me a single column having a district name on each row, which I would like to see transposed into columns.
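
    There is a middle ground that stays in plain SQL: build the column list with GROUP_CONCAT and run the result as a prepared statement. It is still generating the script on the fly, but inside a single session with no stored procedure. A sketch against the table above (note GROUP_CONCAT output is capped by the group_concat_max_len setting):

        SET @sql = NULL;
        SELECT GROUP_CONCAT(DISTINCT
                 CONCAT('SUM(IF(District = ''', District, ''', 1, 0)) AS `', District, '`'))
          INTO @sql
          FROM district_details;

        -- @sql now holds one SUM(IF(...)) column per distinct district
        SET @sql = CONCAT('SELECT Name, ', @sql, ' FROM district_details GROUP BY Name');
        PREPARE stmt FROM @sql;
        EXECUTE stmt;
        DEALLOCATE PREPARE stmt;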

    Read the article

  • Transaction timeout expired while using Linq2Sql DataContext.SubmitChanges()

    - by user68923
    Hi guys, please help me resolve this problem: there is an ambient MSMQ transaction, and I'm trying to use a new transaction for logging, but I get the following error on the attempt to submit changes: "Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding." Here is the code:

        public static void SaveTransaction(InfoToLog info)
        {
            using (TransactionScope scope = new TransactionScope(TransactionScopeOption.RequiresNew))
            {
                using (TransactionLogDataContext transactionDC = new TransactionLogDataContext())
                {
                    transactionDC.MyInfo.InsertOnSubmit(info);
                    transactionDC.SubmitChanges();
                }
                scope.Complete();
            }
        }

    Please help me. Thx.

    Read the article

  • Linq To SQL - Specified cast is not valid - SingleOrDefault()

    - by NullReference
    I am trying to do the following...

        Request request = (from r in db.Requests
                           where r.Status == "Processing" && r.Locked == false
                           select r).SingleOrDefault();

    It is throwing the following exception:

        Message: Specified cast is not valid.
        StackTrace:
           at System.Data.Linq.SqlClient.SqlProvider.Execute(Expression query, QueryInfo queryInfo, IObjectReaderFactory factory, Object[] parentArgs, Object[] userArgs, ICompiledSubQuery[] subQueries, Object lastResult)
           at System.Data.Linq.SqlClient.SqlProvider.ExecuteAll(Expression query, QueryInfo[] queryInfos, IObjectReaderFactory factory, Object[] userArguments, ICompiledSubQuery[] subQueries)
           at System.Data.Linq.SqlClient.SqlProvider.System.Data.Linq.Provider.IProvider.Execute(Expression query)
           at System.Data.Linq.DataQuery`1.System.Linq.IQueryProvider.Execute[S](Expression expression)
           at System.Linq.Queryable.SingleOrDefault[TSource](IQueryable`1 source)
           at GDRequestProcessor.Worker.GetNextRequest()

    Can anyone help me? Thanks in advance!

    Read the article

  • SSRS 2005: How do I make available varbinary data for download in a report?

    - by Angelo
    Hi, SSRS newbie question here... I have a table where one column is varbinary(max) data. I would like to make a report that makes this data available for download as a hyperlink so the user can just click on the item and get a file download dialog for the binary data. In this particular case, the binary data happens to be the content of old pdf files, but that shouldn't matter. I tried searching around but I can't find any pointers on how to do this. It seems to me that it should be possible. There are ways to display images in a report using varbinary data, so it makes sense that one should be able to make arbitrary binary data downloadable on a report, right?

    Read the article
