Search Results

Search found 22897 results on 916 pages for 'query processing'.

Page 41 of 916

  • Content Query Web Part - How do you OrderBy when you QueryOverride?

    - by Richard JP Le Guen
    How do you order items when you override the QueryOverride property of the Content Query Web Part? I have been given responsibility for a Web Part which extends the Content Query Web Part. The QueryOverride property of this Web Part is programmatically changed. Currently, the Web Part does not function as designed, as it does not order the items according to the appropriate field. If I add an <OrderBy> node to the QueryOverride property I get an error message along the lines of 'something wrong with the query this web part is...', and the Content Query Web Part doesn't seem to have an OrderBy property which I could use instead. The "QueryOverride property" section of this MSDN article seems to suggest I should be able to add an <OrderBy> node to the QueryOverride, but a number of websites I've been reading suggest that this is not true. So, how do you order items when you override the QueryOverride property of the Content Query Web Part?

    Read the article

  • How to retrieve only updated/new records since the last query in SQL?

    - by William Choi
    Hi all, I was asked to design a class for caching SQL query results. Calling the class's query method will query and cache the entire set of results the first time; afterward, each subsequent query will retrieve only the updated portion and merge the result into the cache. If the class is required to be generic, i.e. with NO knowledge about the db and the tables, do you have any idea how to do this? Is it possible, and how would you retrieve only the updated/new records since the last query? Thanks! William
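
    One hedged way to handle the incremental part, assuming SQL Server and a table that carries a rowversion column (the table and column names below are hypothetical), is to keep a high-water mark between calls:

      -- Sketch only: dbo.Orders and its RowVer rowversion column are assumptions, not given.
      -- Simplified: a real implementation would capture the mark before reading to avoid gaps.
      DECLARE @LastVersion binary(8) = 0x0000000000000000;  -- persisted by the cache between calls

      SELECT OrderID, CustomerID, Amount
      FROM dbo.Orders
      WHERE RowVer > @LastVersion;   -- only rows inserted or updated since the last call

      SET @LastVersion = @@DBTS;     -- last rowversion used in the database = new high-water mark

    A truly generic class with no schema knowledge would still need a convention like this per table, or a feature such as SQL Server Change Tracking; deleted rows need separate handling either way.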

    Read the article

  • How to test if a lambda expression var query result is null?

    - by mike
    var query = from emp in dbEmp.Employees join dept in dbEmp.Departments on emp.DeptID equals dept.DeptID where dept.DepartmentName.Contains(this.TextBox1.Text) select new { EmpID = emp.EmpID, EmpName = emp.EmpName, Age = emp.Age, Address = emp.Address, DeptName = dept.DepartmentName }; if (query==null) Label1.Text = "no results match your search"; GridView1.DataSource = query; GridView1.DataBind(); Everything else works correctly, but the label never shows the message when the query returns no results. The label shows fine without the (query==null) condition. So how do I test whether a var query result returns nothing? Thanks

    Read the article

  • Adding a calculated field to a Query at run time.

    - by JamesB
    I'm getting data using a query in Delphi, and would like to add a calculated field to the query before it runs. The calculated field uses values in code as well as in the query, so I can't just calculate it in SQL. I know I can attach an OnCalcFields event to actually make the calculation, but the problem is that after adding the calculated field there are no other fields in the query... I did some digging and found that all of the field defs are created, but the actual fields are only created when DefaultFields is set: if DefaultFields then CreateFields; and DefaultFields is determined in procedure TDataSet.DoInternalOpen; begin FDefaultFields := FieldCount = 0; ... end; This would indicate that if you add fields, you only get the fields you added. I would like all the fields in the query AS WELL AS the ones I add. Is this possible, or do I have to add all the fields I'm using as well?

    Read the article

  • Use lineout on Mac for audio processing

    - by eltufto
    Does anyone know of a way to route the line-in on a Mac from the line-out? I essentially want to work with the music played on my Mac. An alternative solution, I suppose, would be to set the line-in as the output from iTunes or another music player. For context: my specific use case is audio processing in Processing with the minim library, calling getLineIn. I'm trying to create some visuals that respond to the music being played.

    Read the article

  • SQL SERVER – ColumnStore Index – Batch Mode vs Row Mode

    - by pinaldave
    What do you do when you are in a hurry and hear someone say something you do not agree with or know to be wrong? Well, let me tell you what I do, or rather what I recently did. I was walking by and heard someone mention, "Columnstore indexes are really great as they are using Batch Mode, which makes them seriously fast." As I passed by and heard this statement, my first reaction was that Columnstore indexes can use both Batch Mode and Row Mode. I stopped, even though I was in a hurry, and asked the person if he meant that Columnstore indexes are seriously fast because they use Batch Mode all the time, or whether Batch Mode is just one of the reasons Columnstore indexes are faster. He responded that Columnstore indexes can run only in Batch Mode. However, I do not like to confront anybody without hearing their complete story. Honestly, I prefer information sharing and avoid confrontation as much as possible; there are always ways to communicate the same thing positively. So this is what I did: I quickly pulled up my earlier article on Columnstore indexes and copied the script into SQL Server Management Studio. I created two versions of the script: 1) a very large table and 2) a reasonably small table. I ran a query which uses the columnstore index against both versions and found the results of my tests very interesting. I saved my tests and sent them to the person who had claimed that Columnstore indexes use Batch Mode only. He immediately acknowledged that he was indeed incorrect in saying that Columnstore indexes use only Batch Mode. What really caught my attention is that he also thanked me for sending him a detailed email instead of just having an argument in the corridor where neither of us had any way to prove a theory. Here are the screenshots of both scenarios: 1) Columnstore index using Batch Mode, 2) Columnstore index using Row Mode. Here is the logic behind when a Columnstore index uses Batch Mode and when it uses Row Mode. A batch typically represents about 1000 rows of data. Batch mode processing uses algorithms that are optimized for multicore CPUs and increased memory throughput. Batch mode processing spreads metadata access costs and overhead over all the rows in a batch, and it operates on compressed data when possible, leading to superior performance. Here is one last point: a Columnstore index can use Batch Mode or Row Mode, but Batch Mode processing is only available with Columnstore indexes. I hope this statement truly sums up the whole concept. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Index, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
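
    A minimal way to reproduce this kind of test (only a sketch, not the script from the article; the table and index names are hypothetical) is to run the same aggregate against a large and a small columnstore-indexed table and compare the actual execution mode (Batch vs Row) reported on the scan operator in the actual execution plan:

      -- Sketch: assumes a dbo.FactSales fact table; SQL Server 2012 nonclustered columnstore.
      CREATE NONCLUSTERED COLUMNSTORE INDEX IX_FactSales_ColumnStore
          ON dbo.FactSales (OrderDateKey, ProductKey, SalesAmount);
      GO
      SET STATISTICS XML ON;   -- capture the actual execution plan
      SELECT ProductKey, SUM(SalesAmount) AS TotalSales
      FROM dbo.FactSales
      GROUP BY ProductKey;
      SET STATISTICS XML OFF;
      -- In the returned plan, the Columnstore Index Scan operator reports whether it
      -- actually ran in Batch or Row mode, which is what the two screenshots contrast.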

    Read the article

  • SQL Query - Limiting query results

    - by Gublooo
    I am quite certain we cannot use the LIMIT clause for what I want to do, so I wanted to find out if there are any other ways to accomplish this. I have a table which captures which user visited which store. Every time a user visits a store, a row is inserted into this table. Some of the fields are shopping_id (primary key), store_id and user_id. Now what I want is: for a given set of stores, find the top 5 users who have visited the store the maximum number of times. I can do this one store at a time as: select store_id,user_id,count(1) as visits from shopping where store_id = 60 group by user_id,store_id order by visits desc Limit 5 This will give me the 5 users who have visited store_id=60 the most times. What I want to do is provide a list of 10 store_ids and for each store fetch the 5 users who have visited that store the most times: select store_id,user_id,count(1) as visits from shopping where store_id in (60,61,62,63,64,65,66) group by user_id,store_id order by visits desc Limit 5 This will not work, as the Limit at the end will return only 5 rows in total rather than 5 rows for each store. Any ideas on how I can achieve this? I can always write a loop and pass one store at a time, but wanted to know if there is a better way.
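
    One possible approach, sketched below against the question's shopping table and assuming the database supports window functions (SQL Server 2005+, PostgreSQL, MySQL 8.0+), is to rank users within each store and keep the top 5 of each:

      -- Sketch: rank users per store by visit count, then keep rank <= 5.
      SELECT store_id, user_id, visits
      FROM (
          SELECT store_id,
                 user_id,
                 COUNT(*) AS visits,
                 ROW_NUMBER() OVER (PARTITION BY store_id
                                    ORDER BY COUNT(*) DESC) AS rn
          FROM shopping
          WHERE store_id IN (60, 61, 62, 63, 64, 65, 66)
          GROUP BY store_id, user_id
      ) AS ranked
      WHERE rn <= 5
      ORDER BY store_id, visits DESC;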

    Read the article

  • MDX: Problem filtering results in MDX query used in Reporting Services query

    - by wgpubs
    Why aren't my results being filtered by the members from my [Group Hierarchy] returned via the filter() statement below? SELECT NON EMPTY {[Measures].[Group Count], [Measures].[Overall Group Count] } ON COLUMNS, NON EMPTY { [Survey].[Surveys By Year].[Survey Year].ALLMEMBERS * [Response Status].[Response Status].[Response Status].ALLMEMBERS} DIMENSION PROPERTIES MEMBER_CAPTION, MEMBER_UNIQUE_NAME ON ROWS FROM ( SELECT ( { [Survey Type].[Survey Type Hierarchy].&[9] } ) ON COLUMNS FROM ( SELECT ( { [Response Status].[Response Status].[All] } ) ON COLUMNS FROM ( SELECT ( STRTOSET(@SurveySurveysByYear, CONSTRAINED) ) ON COLUMNS FROM ( SELECT(filter([Group].[Group Hierarchy].members, instr(@GroupGroupFullName,[Group].[Group Hierarchy].Properties( "Group Full Name" )))) on columns FROM [SysSurveyDW])))) CELL PROPERTIES VALUE, BACK_COLOR, FORE_COLOR, FORMATTED_VALUE, FORMAT_STRING, FONT_NAME, FONT_SIZE, FONT_FLAGS

    Read the article

  • Variable in one query is getting into another query in a view

    - by Jason Shultz
    I have two foreach statements. The variable from one is somehow getting into the other and I'm not sure how to fix it. Here's my controller: // Categories Page Code function categories($id) { $this->load->model('Business_model'); $data['businessList'] = $this->Business_model->categoryPageList($id); $data['catList'] = $this->Business_model->categoryList(); $data['featured'] = $this->Business_model->frontPageList(); $data['user_id'] = $this->tank_auth->get_user_id(); $data['username'] = $this->tank_auth->get_username(); $data['page_title'] = 'Welcome To Jerome - Largest Ghost Town in America'; $data['page'] = 'category_view'; // pass the actual view to use as a parameter $this->load->view('container',$data); } What happens is that the categories page will only show the businesses in a certain category. The businesses are pulled from the database using the categoryPageList($id) function. Here is that function: function categoryPageList($id) { $this->db->select('b.id, b.busname, b.busowner, b.webaddress, p.thumb, v.title, c.catname, s.specname, p.thumb, c.id'); $this->db->from ('business AS b'); $this->db->where('b.category', $id); $this->db->join('photos AS p', 'p.busid = b.id', 'left'); $this->db->join('video AS v', 'v.busid = b.id', 'left'); $this->db->join('specials AS s', 's.busid = b.id', 'left'); $this->db->join('category As c', 'b.category = c.id', 'left'); $this->db->group_by("b.id"); return $this->db->get(); } And here is the view: <h2>Welcome to Jerome, Arizona</h2> <p>Choose the Category of Business you are interested in:<br/> <?php foreach ($catList->result() as $row): ?> <a href="/site/categories/<?=$row->id?>"><?=$row->catname?></a>, &nbsp; <?php endforeach; ?></p> <table id="businessTable" class="tablesorter"> <thead><tr><th>Business Name</th><th>Business Owner</th><th>Web</th><th>Photos</th><th>Videos</th><th>Specials</th></tr></thead> <?php if(count($businessList) > 0) : foreach ($businessList->result() as $crow): ?> <tr> <td><a href="/site/business/<?=$crow->id?>"><?=$crow->busname?></a></td> <td><?=$crow->busowner?></td> <td><a href="<?=$crow->webaddress?>">Visit Site</a></td> <td> <?php if(isset($crow->thumb)):?> yes <?php else:?> no <?php endif?> </td> <td> <?php if(isset($crow->title)):?> yes <?php else:?> no <?php endif?> </td> <td> <?php if(isset($crow->specname)):?> yes <?php else:?> no <?php endif?> </td> </tr> <?php endforeach; ?> <?php else : ?> <td colspan="4"><p>No Category Selected</p></td> <?php endif; ?> </table> The problem occurs here: <?=$crow->id?> should be showing the row id from the business table. Instead, it's showing the row ID of the category table. So, if I'm viewing /site/categories/6, <?=$crow->id?> will show 6 when it should be showing 10 (the row ID of the only business in that category at this time). How can I fix this?

    Read the article

  • How to create a UDF that takes a query string and returns the query's resultset

    - by Martin
    I want to create a stored procedure that takes a simple SELECT statement and returns the resultset as a CSV string. So the basic idea is to get the SQL statement from user input, run it using EXEC(@stmt) and convert the resultset to text using cursors. However, SQL Server allows neither select * from storedprocedure(@sqlStmt) nor a UDF containing EXEC(@sqlStmt), so I tried Insert into #tempTable EXEC(@sqlStmt), but this doesn't work (error = "invalid object name #tempTable"). I'm stuck. Could you please shed some light on this matter? Many thanks EDIT: Actually the output (e.g. a CSV string) is not important. The problem is I don't know how to assign a cursor to the resultset returned by EXEC. SPs and UDFs do not work with Exec(), while creating a temp table before inserting values is impossible without knowing the input statement. I thought of OPENQUERY but it does not accept variables as its parameters.
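
    For what it is worth, here is a small sketch of the temp-table route (the table and column names are made up): INSERT ... EXEC does work when the temp table already exists in the calling scope, and when the result shape is unknown in advance a global temp table created inside the dynamic SQL survives the EXEC:

      -- Works when the result shape is known: create #tempTable first, in the same scope.
      CREATE TABLE #tempTable (id int, name nvarchar(100));
      DECLARE @sqlStmt nvarchar(max);
      SET @sqlStmt = N'SELECT id, name FROM dbo.SomeTable';
      INSERT INTO #tempTable
      EXEC (@sqlStmt);

      -- When the shape is unknown, let the dynamic SQL build a global temp table itself;
      -- ##results outlives the EXEC (drop it first if it already exists, and when done).
      EXEC (N'SELECT * INTO ##results FROM (' + @sqlStmt + N') AS q');
      SELECT * FROM ##results;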

    Read the article

  • LINQ Query using UDF that receives parameters from the query

    - by Ben Fidge
    I need help using a UDF in a LINQ query which calculates a user's position from a fixed point. int pointX = 567, pointY = 534; // random points on a square grid var q = from n in _context.Users join m in _context.GetUserDistance(n.posY, n.posY, pointX, pointY, n.UserId) on n.UserId equals m.UserId select new User() { PosX = n.PosX, PosY = n.PosY, Distance = m.Distance, Name = n.Name, UserId = n.UserId }; GetUserDistance is just a UDF that returns a single row, as a table-valued result, with that user's distance from the points designated by the pointX and pointY variables, and the designer generates the following for it: [global::System.Data.Linq.Mapping.FunctionAttribute(Name="dbo.GetUserDistance", IsComposable=true)] public IQueryable<GetUserDistanceResult> GetUserDistance([global::System.Data.Linq.Mapping.ParameterAttribute(Name="X1", DbType="Int")] System.Nullable<int> x1, [global::System.Data.Linq.Mapping.ParameterAttribute(Name="X2", DbType="Int")] System.Nullable<int> x2, [global::System.Data.Linq.Mapping.ParameterAttribute(Name="Y1", DbType="Int")] System.Nullable<int> y1, [global::System.Data.Linq.Mapping.ParameterAttribute(Name="Y2", DbType="Int")] System.Nullable<int> y2, [global::System.Data.Linq.Mapping.ParameterAttribute(Name="UserId", DbType="Int")] System.Nullable<int> userId) { return this.CreateMethodCallQuery<GetUserDistanceResult>(this, ((MethodInfo)(MethodInfo.GetCurrentMethod())), x1, x2, y1, y2, userId); } When I try to compile I get: The name 'n' does not exist in the current context

    Read the article

  • E: Sub-process /usr/bin/dpkg returned an error code (1)

    - by Joel
    I can't install or update anything on my Ubuntu 12.04 system. I get the error... installArchives() failed: dpkg: error processing libqt4-xmlpatterns (--configure): libqt4-xmlpatterns:amd64 4:4.8.1-0ubuntu4.2 cannot be configured because libqt4-xmlpatterns:i386 is in a different version (4:4.8.1-0ubuntu4.3) dpkg: error processing libqt4-xmlpatterns:i386 (--configure): libqt4-xmlpatterns:i386 4:4.8.1-0ubuntu4.3 cannot be configured because libqt4-xmlpatterns:amd64 is in a different version (4:4.8.1-0ubuntu4.2) dpkg: dependency problems prevent configuration of libqt4-declarative:i386: libqt4-declarative:i386 depends on libqt4-xmlpatterns (= 4:4.8.1-0ubuntu4.3); however: Package libqt4-xmlpatterns:i386 is not configured yet. dpkg: error processing libqt4-declarative:i386 (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libqt4-declarative: libqt4-declarative depends on libqt4-xmlpatterns (= 4:4.8.1-0ubuntu4.3); however: Version of libqt4-xmlpatterns on system is 4:4.8.1-0ubuntu4.2. dpkg: error processing libqt4-declarative (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libqtgui4:i386: libqtgui4:i386 depends on libqt4-declarative (= 4:4.8.1-0ubuntu4.3); however: Package libqt4-declarative:i386 is not configured yet. dpkg: error processing libqtgui4:i386 (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libqtgui4: No apport report written because the error message indicates its a followup error from a previous failure. No apport report written because the error message indicates its a followup error from a previous failure. No apport report written because MaxReports is reached already No apport report written because MaxReports is reached already No apport report written because MaxReports is reached already No apport report written because MaxReports is reached already libqtgui4 depends on libqt4-declarative (= 4:4.8.1-0ubuntu4.3); however: Package libqt4-declarative is not configured yet. dpkg: error processing libqtgui4 (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libqt4-designer: libqt4-designer depends on libqtgui4 (= 4:4.8.1-0ubuntu4.3); however: Package libqtgui4 is not configured yet. dpkg: error processing libqt4-designer (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libqt4-designer:i386: libqt4-designer:i386 depends on libqtgui4 (= 4:4.8.1-0ubuntu4.3); however: Package libqtgui4:i386 is not configured yet. dpkg: error processing libqt4-designer:i386 (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libqt4-opengl: libqt4-opengl depends on libqtgui4 (= 4:4.8.1-0ubuntu4.3); however: Package libqtgui4 is not configured yet. dpkg: error processing libqt4-opengl (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libqt4-opengl:i386: libqt4-opengl:i386 depends on libqtgui4 (= 4:4.8.1-0ubuntu4.3); however: Package libqtgui4:i386 is not configured yet. dpkg: error processing libqt4-opengl:i386 (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libqt4-qt3support: libqt4-qt3support depends on libqt4-designer (= 4:4.8.1-0ubuntu4.3); however: Package libqt4-designer is not configured yet.
libqt4-qt3support depends on libqtgui4 (= 4:4.8.1-0ubuntu4.3); however: Package libqtgui4 is not configured yet. dpkg: error processing libqt4-qt3support (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libqt4-qt3support:i386: libqt4-qt3support:i386 depends on libqt4-designer (= 4:4.8.1-0ubuntu4.3); however: Package libqt4-designer:i386 is not configured yet. libqt4-qt3support:i386 depends on libqtgui4 (= 4:4.8.1-0ubuntu4.3); however: Package libqtgui4:i386 is not configured yet. dpkg: error processing libqt4-qt3support:i386 (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libqt4-scripttools:i386: libqt4-scripttools:i386 depends on libqtgui4 (= 4:4.8.1-0ubuntu4.3); however: Package libqtgui4:i386 is not configured yet. dpkg: error processing libqt4-scripttools:i386 (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libqt4-svg: libqt4-svg depends on libqtgui4 (= 4:4.8.1-0ubuntu4.3); however: Package libqtgui4 is not configured yet. dpkg: error processing libqt4-svg (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libqt4-svg:i386: libqt4-svg:i386 depends on libqtgui4 (= 4:4.8.1-0ubuntu4.3); however: Package libqtgui4:i386 is not configured yet. dpkg: error processing libqt4-svg:i386 (--configure): dependency problems - leaving unconfigured Errors were encountered while processing: libqt4-xmlpatterns libqt4-xmlpatterns:i386 libqt4-declarative:i386 libqt4-declarative libqtgui4:i386 libqtgui4 libqt4-designer libqt4-designer:i386 libqt4-opengl libqt4-opengl:i386 libqt4-qt3support libqt4-qt3support:i386 libqt4-scripttools:i386 libqt4-svg libqt4-svg:i386 Error in function: dpkg: dependency problems prevent configuration of libqt4-declarative: libqt4-declarative depends on libqt4-xmlpatterns (= 4:4.8.1-0ubuntu4.3); however: Version of libqt4-xmlpatterns on system is 4:4.8.1-0ubuntu4.2. dpkg: error processing libqt4-declarative (--configure): dependency problems - leaving unconfigured dpkg: error processing libqt4-xmlpatterns (--configure): libqt4-xmlpatterns:amd64 4:4.8.1-0ubuntu4.2 cannot be configured because libqt4-xmlpatterns:i386 is in a different version (4:4.8.1-0ubuntu4.3) dpkg: error processing libqt4-xmlpatterns:i386 (--configure): libqt4-xmlpatterns:i386 4:4.8.1-0ubuntu4.3 cannot be configured because libqt4-xmlpatterns:amd64 is in a different version (4:4.8.1-0ubuntu4.2) dpkg: dependency problems prevent configuration of libqtgui4: libqtgui4 depends on libqt4-declarative (= 4:4.8.1-0ubuntu4.3); however: Package libqt4-declarative is not configured yet. dpkg: error processing libqtgui4 (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libqt4-declarative:i386: libqt4-declarative:i386 depends on libqt4-xmlpatterns (= 4:4.8.1-0ubuntu4.3); however: Package libqt4-xmlpatterns:i386 is not configured yet. dpkg: error processing libqt4-declarative:i386 (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libqt4-svg: libqt4-svg depends on libqtgui4 (= 4:4.8.1-0ubuntu4.3); however: Package libqtgui4 is not configured yet. 
dpkg: error processing libqt4-svg (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libqt4-opengl: libqt4-opengl depends on libqtgui4 (= 4:4.8.1-0ubuntu4.3); however: Package libqtgui4 is not configured yet. dpkg: error processing libqt4-opengl (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libqt4-designer: libqt4-designer depends on libqtgui4 (= 4:4.8.1-0ubuntu4.3); however: Package libqtgui4 is not configured yet. dpkg: error processing libqt4-designer (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libqt4-qt3support: libqt4-qt3support depends on libqt4-designer (= 4:4.8.1-0ubuntu4.3); however: Package libqt4-designer is not configured yet. libqt4-qt3support depends on libqtgui4 (= 4:4.8.1-0ubuntu4.3); however: Package libqtgui4 is not configured yet. dpkg: error processing libqt4-qt3support (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libqtgui4:i386: libqtgui4:i386 depends on libqt4-declarative (= 4:4.8.1-0ubuntu4.3); however: Package libqt4-declarative:i386 is not configured yet. dpkg: error processing libqtgui4:i386 (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libqt4-svg:i386: libqt4-svg:i386 depends on libqtgui4 (= 4:4.8.1-0ubuntu4.3); however: Package libqtgui4:i386 is not configured yet. dpkg: error processing libqt4-svg:i386 (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libqt4-opengl:i386: libqt4-opengl:i386 depends on libqtgui4 (= 4:4.8.1-0ubuntu4.3); however: Package libqtgui4:i386 is not configured yet. dpkg: error processing libqt4-opengl:i386 (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of libqt4-designer:i386: joel@Joel-PC:~$ sudo apt-get install -f [sudo] password for joel: Läser paketlistor... Färdig Bygger beroendeträd Läser tillståndsinformation... Färdig Korrigerar beroenden.... Färdig Följande paket har installerats automatiskt och är inte längre nödvändiga: kde-l10n-sv language-pack-kde-sv-base language-pack-kde-zh-hans-base calligra-l10n-engb calligra-l10n-sv calligra-l10n-zhcn language-pack-kde-en kde-l10n-engb language-pack-kde-sv language-pack-zh-hans-base kde-l10n-zhcn language-pack-zh-hans language-pack-kde-zh-hans language-pack-kde-en-base Använd "apt-get autoremove" för att ta bort dem. Följande ytterligare paket kommer att installeras: libqt4-xmlpatterns Följande paket kommer att uppgraderas: libqt4-xmlpatterns 1 att uppgradera, 0 att nyinstallera, 0 att ta bort och 22 att inte uppgradera. 15 är inte helt installerade eller borttagna. Behöver hämta 0 B/1 033 kB arkiv. Efter denna åtgärd kommer ytterligare 0 B utrymme användas på disken. Vill du fortsätta [J/n]? 
J dpkg: fel vid hantering av libqt4-xmlpatterns (--configure): libqt4-xmlpatterns:amd64 4:4.8.1-0ubuntu4.2 cannot be configured because libqt4-xmlpatterns:i386 is in a different version (4:4.8.1-0ubuntu4.3) dpkg: fel vid hantering av libqt4-xmlpatterns:i386 (--configure): libqt4-xmlpatterns:i386 4:4.8.1-0ubuntu4.3 cannot be configured because libqt4-xmlpatterns:amd64 is in a different version (4:4.8.1-0ubuntu4.2) dpkg: beroendeproblem förhindrar konfigurering av libqt4-declarative:i386: libqt4-declarative:i386 är beroende av libqt4-xmlpatterns (= 4:4.8.1-0ubuntu4.3), men: Paketet libqt4-xmlpatterns:i386 har inte konfigurerats ännu. dpkg: fel vid hantering av libqt4-declarative:i386 (--configure): beroendeproblem - lämnar okonfigurerad dpkg: beroendeproblem förhindrar konfigurering av libqt4-declarative: libqt4-declarative är beroende av libqt4-xmlpatterns (= 4:4.8.1-0ubuntu4.3), men: Versionen av libqt4-xmlpatterns på systemet är 4:4.8.1-0ubuntu4.2. dpkg: fel vid hantering av libqt4-declarative (--configure): beroendeproblem - lämnar okonfigurerad dpkg: beroendeproblem förhindrar konfigurering av libqtgui4:i386: libqtgui4:i386 är beroende av libqt4-declarative (= 4:4.8.1-0ubuntu4.3), men: Paketet libqt4-declarative:i386 har inte konfigurerats ännu. dpkg: fel vid hantering av libqtgui4:i386 (--configure): beroendeproblem - lämnar okonfigurerad dpkg: beroendeproblem förhindrar koIngen apport-rapport skrevs därför att felmeddelandet indikerar att det är ett efterföljande fel från ett tidigare problem. Ingen apport-rapport skrevs därför att felmeddelandet indikerar att det är ett efterföljande fel från ett tidigare problem. Ingen apport-rapport skrevs därför att felmeddelandet indikerar att det är ett efterföljande fel från ett tidigare problem. Ingen apport-rapport skrevs därför att felmeddelandet indikerar att det är ett efterföljande fel från ett tidigare problem. Ingen apport-rapport skrevs därför att felmeddelandet indikerar att det är ett efterföljande fel från ett tidigare problem. Ingen apport-rapport skrevs därför att felmeddelandet indikerar att det är ett efterföljande fel från ett tidigare problem. nfigurering av libqtgui4: libqtgui4 är beroende av libqt4-declarative (= 4:4.8.1-0ubuntu4.3), men: Paketet libqt4-declarative har inte konfigurerats ännu. dpkg: fel vid hantering av libqtgui4 (--configure): beroendeproblem - lämnar okonfigurerad dpkg: beroendeproblem förhindrar konfigurering av libqt4-designer: libqt4-designer är beroende av libqtgui4 (= 4:4.8.1-0ubuntu4.3), men: Paketet libqtgui4 har inte konfigurerats ännu. dpkg: fel vid hantering av libqt4-designer (--configure): beroendeproblem - lämnar okonfigurerad dpkg: beroendeproblem förhindrar konfigurering av libqt4-designer:i386: libqt4-designer:i386 är beroende av libqtgui4 (= 4:4.8.1-0ubuntu4.3), men: Paketet libqtgui4:i386 har inte konfigurerats ännu. dpkg: fel vid hantering av libqt4-designer:i386 (--configure): beroendeproblem - lämnar okonfigurerad dpkg: beroendeproblem förhindrar konfigurering av libqt4-opengl: libqt4-opengl är beroende av libqtgui4 (= 4:4.8.1-0ubuntu4.3), men: Paketet libqtgui4 har inte konfigurerats ännu. dpkg: fel vid hantering av libqt4-opengl (--configure): beroendeproblem - lämnar okonfigurerad dpkg: beroendeproblem förhindrar konfigurering av libqt4-opengl:i386: libqt4-opengl:i386 är beroende av libqtgui4 (= 4:4.8.1-0ubuntu4.3), men: Paketet libqtgui4:i386 har inte konfigurerats ännu. 
dpkg: fel vid hantering av libqt4-opengl:i386 (--configure): beroendeproblem - lämnar okonfigurerad dpkg: beroendeproblem förhindrar konfigurering av libqt4-qt3support: libqt4-qt3support är beroende av libqt4-designer (= 4:4.8.1-0ubuntu4.3), men: Paketet libqt4-designer har inte konfigurerats ännu. libqt4-qt3support är beroende av libqtgui4 (= 4:4.8.1-0ubuntu4.3), men: Paketet libqtgui4 har inte konfigurerats ännu. dpkg: fel vid hantering av libqt4-qt3support (--configure): beroendeproblem - lämnar okonfigurerad dpkg: beroendeproblem förhindrar konfigurering av libqt4-qt3support:i386: libqt4-qt3support:i386 är beroende av libqt4-designer (= 4:4.8.1-0ubuntu4.3), men: Paketet libqt4-designer:i386 har inte konfigurerats ännu. libqt4-qt3support:i386 är beroende av libqtgui4 (= 4:4.8.1-0ubuntu4.3), men: Paketet libqtgui4:i386 har inte konfigurerats ännu. dpkg: fel vid hantering av libqt4-qt3support:i386 (--configure): beroendeproblem - lämnar okonfigurerad dpkg: beroendeproblem förhindrar konfigurering av libqt4-scripttools:i386: libqt4-scripttools:i386 är beroende av libqtgui4 (= 4:4.8.1-0ubuntu4.3), men: Paketet libqtgui4:i386 har inte konfigurerats ännu. dpkg: fel vid hantering av libqt4-scripttools:i386 (--configure): beroendeproblem - lämnar okonfigurerad dpkg: beroendeproblem förhindrar konfigurering av libqt4-svg: libqt4-svg är beroende av libqtgui4 (= 4:4.8.1-0ubuntu4.3), men: Paketet libqtgui4 har inte konfigurerats ännu. dpkg: fel vid hantering av libqt4-svg (--configure): beroendeproblem - lämnar okonfigurerad dpkg: beroendeproblem förhindrar konfigurering av libqt4-svg:i386: libqt4-svg:i386 är beroende av libqtgui4 (= 4:4.8.1-0ubuntu4.3), men: Paketet libqtgui4:i386 har inte konfigurerats ännu. dpkg: fel vid hantering av libqt4-svg:i386 (--configure): beroendeproblem - lämnar okonfigurerad Fel uppstod vid hantering: libqt4-xmlpatterns libqt4-xmlpatterns:i386 libqt4-declarative:i386 libqt4-declarative libqtgui4:i386 libqtgui4 libqt4-designer libqt4-designer:i386 libqt4-opengl libqt4-opengl:i386 libqt4-qt3support libqt4-qt3support:i386 libqt4-scripttools:i386 libqt4-svg libqt4-svg:i386 E: Sub-process /usr/bin/dpkg returned an error code (1)

    Read the article

  • How can I do batch image processing with ImageJ in Java or Clojure?

    - by Robert McIntyre
    I want to use ImageJ to do some processing of several thousand images. Is there a way to take any general ImageJ plugin and apply it to hundreds of images automatically? For example, say I want to take my thousand images and apply a polar transformation to each. A polar transformation plugin for ImageJ can be found here: http://rsbweb.nih.gov/ij/plugins/polar-transformer.html Great! Let's use it. From [http://albert.rierol.net/imagej_programming_tutorials.html#How%20to%20automate%20an%20ImageJ%20dialog] I find that I can apply a plugin using the following: (defn x-polar [imageP] (let [thread (Thread/currentThread) options ""] (.setName thread "Run$_polar-transform") (Macro/setOptions thread options) (IJ/runPlugIn imageP "Polar_Transformer" ""))) This is good because it suppresses the dialog which would otherwise pop up for every image. But running this always brings up a window containing the transformed image, when what I want is to simply return the transformed image. The crudest way to do what I want is to just close the window that comes up and return the image which it was displaying. This does what I want but is clearly the wrong approach: (defn x-polar [imageP] (let [thread (Thread/currentThread) options ""] (.setName thread "Run$_polar-transform") (Macro/setOptions thread options) (IJ/runPlugIn imageP "Polar_Transformer" "") (let [return-image (IJ/getImage)] (.hide return-image) return-image))) I'm obviously missing something about how to use ImageJ plugins in a programming context. Does anyone know the right way to do this? Thanks, --Robert McIntyre

    Read the article

  • Filtering SQLAlchemy query on attribute_mapped_collection field of relationship

    - by bsa
    I have two classes, Tag and Hardware, defined with a simple parent-child relationship (see the full definition at the end). Now I want to filter a query on Tag using the version field in Hardware through an attribute_mapped_collection, e.g.: def get_tags(order_code=None, hardware_filters=None): session = Session() query = session.query(Tag) if order_code: query = query.filter(Tag.order_code == order_code) if hardware_filters: for k, v in hardware_filters.iteritems(): query = query.filter(getattr(Tag.hardware, k).version == v) return query.all() But I get: AttributeError: Neither 'InstrumentedAttribute' object nor 'Comparator' object associated with Tag.hardware has an attribute 'baseband' The same thing happens if I strip it back by hard-coding the attribute, e.g.: query.filter(Tag.hardware.baseband.version == v) I can do it this way: query = query.filter(Tag.hardware.any(artefact=k, version=v)) But why can't I filter directly through the attribute? Class definitions class Tag(Base): __tablename__ = 'tag' tag_id = Column(Integer, primary_key=True) order_code = Column(String, nullable=False) version = Column(String, nullable=False) status = Column(String, nullable=False) comments = Column(String) hardware = relationship( "Hardware", backref="tag", collection_class=attribute_mapped_collection('artefact'), ) __table_args__ = ( UniqueConstraint('order_code', 'version'), ) class Hardware(Base): __tablename__ = 'hardware' hardware_id = Column(Integer, primary_key=True) tag_id = Column(String, ForeignKey('tag.tag_id')) product_id = Column(String, nullable=True) artefact = Column(String, nullable=False) version = Column(String, nullable=False)

    Read the article

  • Best practice for directory polling

    - by Hieu Lam
    Hi all, I have to do batch processing to automate a business process. I have to poll a directory at regular intervals to detect new files and process them. While old files are being processed, new files can come in. For now, I use the Quartz scheduler and thread synchronization to ensure that only one thread can process files. Parts of the code are: application-context.xml <bean id="methodInvokingJob" class="org.springframework.scheduling.quartz.MethodInvokingJobDetailFactoryBean"> <property name="targetObject" ref="documentProcessor" /> <property name="targetMethod" value="processDocuments" /> </bean> DocumentProcessor ..... public void processDocuments() { LOG.info(Thread.currentThread().getName() + " attempt to run."); if (!processing) { synchronized (this) { try { processing = true; LOG.info(Thread.currentThread().getName() + " is processing"); List<String> xmlDocuments = documentManager.getFileNamesFromFolder(incomingFolderPath); // loop over the files and process unlocked files. for (String xmlDocument : xmlDocuments) { processDocument(xmlDocument); } } finally { processing = false; } } } } With the current code, I prevent other threads from processing files while one thread is processing. Is that a good idea, or should I support multi-threaded processing? In that case, how can I know which files are being processed and which files have just arrived? Any ideas are really appreciated.

    Read the article

  • Distributing processing for an application that wasn't designed with that in mind

    - by Tim
    We've got an application at work that just sits and does a whole bunch of iterative processing on some data files to perform some simulations. This is done by an "old" Win32 application that isn't multi-processor aware, so new(ish) computers and workstations are mostly sitting idle while running this application. However, since it's installed by a typical Windows InstallShield installer, I can't seem to install and run multiple copies of the application. The work can be split up manually before processing, enabling the work to be distributed across multiple machines, but we still can't take advantage of multi-core CPUs. The results can be joined back together after processing to make a complete simulation. Is there a product out there that would let me "compartmentalize" an installation (or 4) so I can take advantage of a multi-core CPU? I had thought of using MS SoftGrid, but I believe that still depends on a remote server to do the heavy lifting (though please correct me if I'm wrong). Furthermore, is there a way I can distribute the workload off the one machine? So an input could be split into 50 chunks, handed out to 50 machines, and worked on, all without really changing the initial application? In a perfect world, I'd get the application to take advantage of a desktop grid (BOINC), but like most "mission critical corporate applications", the need is there, but the money is not. Thank you in advance (and sorry if this isn't appropriate for serverfault).

    Read the article

  • SQL SERVER – What is Spatial Database? – Developing with SQL Server Spatial and Deep Dive into Spatial Indexing

    - by pinaldave
    What is Spatial Database? A spatial database is a database that is optimized to store and query data related to objects in space, including points, lines and polygons. While typical databases can understand various numeric and character types of data, additional functionality needs to be added for databases to process spatial data types. (Source: Wikipedia) Today I will be talking about this same subject at Microsoft TechEd India. If you want to learn about the spatial aspects of data and how to integrate them with SQL Server, this is the perfect session for you. Spatial data is a very special concept in SQL Server and I really like how it is implemented. In general, performance tuning and query optimization is something I have always enjoyed in my professional life. Indexes are my best friends, and many times by implementing them, and many times by removing them, I have improved the performance of a system. In this session, I will be talking about indexes along with spatial data. As a spatial database is a very interesting concept, I will cover this subject in ten short but very interesting slides. I will make sure that in the very first 20 minutes you understand the following topics: introduction to spatial databases, a one-line definition, understanding spatial indexing, index internals, query/performance tuning, query hinting/cost analysis, spatial index catalog views, performance troubleshooting, finding the optimal index using the spatial index stored procedure, common errors, and index maintenance. This slide deck will be followed by a demo of around 30 minutes which tells the story of geometry, geography, index internals and performance tuning. If you are interested in learning how GIS works and how SQL Server supports these wonderful tools out of the box, you will really like how the story is told. I am sure all the people who attend the event know how Bangalore is positioned on the map of India. I will take the example of Bangalore and Hyderabad and demonstrate how an index can improve performance. Well, there are lots of stories to tell in the session, and I will be opening it with the beautiful script of Botticelli's Birth of Venus created by Michael J. Swart. I will also demonstrate a few real-life scenarios where I will be talking about spatial databases and their usage. Do not miss this session. At the end of the session a book will be awarded to the best participant. My session details: Session 3: Developing with SQL Server Spatial and Deep Dive into Spatial Indexing Date: April 14, 2010 Time: 5:00pm-6:00pm Microsoft SQL Server 2008 delivers new spatial data types that enable you to consume, use, and extend location-based data through spatial-enabled applications. Attend this session to learn how to use spatial functionality in the next version of SQL Server to build and optimize spatial queries. This session outlines how to use the new geography data type to store geodetic spatial data and perform operations on it, use the new geometry data type to store planar spatial data and perform operations on it, take advantage of new spatial indexes for high-performance queries, use the new spatial results tab to quickly and easily view spatial query results directly from within Management Studio, extend spatial data capabilities by building or integrating location-enabled applications through support for spatial standards and specifications, and much more.
Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, SQL, SQL Authority, SQL Index, SQL Optimization, SQL Performance, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Author Visit, T SQL, Technology Tagged: Spatial Database
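
    As a taste of the kind of query the session builds on, here is a tiny geography sketch (not taken from the session material; the city coordinates are approximate) computing the distance from Bangalore to Hyderabad:

      -- Sketch only: approximate coordinates, SRID 4326 (WGS 84).
      DECLARE @Bangalore geography = geography::Point(12.9716, 77.5946, 4326);
      DECLARE @Hyderabad geography = geography::Point(17.3850, 78.4867, 4326);

      -- STDistance() returns meters for this SRID; divide to get kilometers.
      SELECT @Bangalore.STDistance(@Hyderabad) / 1000.0 AS DistanceInKm;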

    Read the article

  • SQL SERVER – SOS_SCHEDULER_YIELD – Wait Type – Day 8 of 28

    - by pinaldave
    This is a very interesting wait type and it is quite often seen as one of the top wait types. Let us discuss it today. From Books Online: Occurs when a task voluntarily yields the scheduler for other tasks to execute. During this wait the task is waiting for its quantum to be renewed. SOS_SCHEDULER_YIELD Explanation: SQL Server has multiple threads, and the basic working methodology of SQL Server is that it does not let any "runnable" thread starve. Now let us assume the SQL Server OS is very busy running threads on all the schedulers. There are always new threads coming up which are ready to run (in other words, runnable). Thread management is decided by SQL Server and not by the operating system. SQL Server runs in non-preemptive mode most of the time, meaning the threads are co-operative and can let other threads run from time to time by yielding. When any thread yields itself for another thread, it creates this wait. If there are many such threads, it clearly indicates that the CPU is under pressure. You can run the following DMV query to see how many runnable tasks there are in your system. SELECT scheduler_id, current_tasks_count, runnable_tasks_count, work_queue_count, pending_disk_io_count FROM sys.dm_os_schedulers WHERE scheduler_id < 255 GO If you notice a two-digit number in runnable_tasks_count continuously for a long time (not just once in a while), you will know that there is CPU pressure. A two-digit number is usually considered a bad sign; you can read the description of the above DMV over here. Additionally, there are several other counters (% Processor Time and other processor-related counters) which you can refer to in order to validate CPU pressure along with the method explained above. Reducing SOS_SCHEDULER_YIELD wait: This is the trickiest part of this procedure. As discussed, this particular wait type relates to CPU pressure. Adding more CPU is the solution in simple terms; however, it is not an easy solution to implement. There are other things that you can consider when this wait type is very high. Here is a query with which you can find the most CPU-expensive queries from the cache. Note: A query that used lots of resources but is not cached will not be caught here. SELECT SUBSTRING(qt.TEXT, (qs.statement_start_offset/2)+1, ((CASE qs.statement_end_offset WHEN -1 THEN DATALENGTH(qt.TEXT) ELSE qs.statement_end_offset END - qs.statement_start_offset)/2)+1), qs.execution_count, qs.total_logical_reads, qs.last_logical_reads, qs.total_logical_writes, qs.last_logical_writes, qs.total_worker_time, qs.last_worker_time, qs.total_elapsed_time/1000000 total_elapsed_time_in_S, qs.last_elapsed_time/1000000 last_elapsed_time_in_S, qs.last_execution_time, qp.query_plan FROM sys.dm_exec_query_stats qs CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) qt CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) qp ORDER BY qs.total_worker_time DESC -- CPU time You can find the most expensive queries that are utilizing lots of CPU (from the cache) and tune them accordingly. Moreover, you can find the longest-running queries and attempt to tune them if there is any processor-offending code. Additionally, pay attention to total_worker_time, because if that is also consistently high, then the CPU is under too much pressure. You can also check the perfmon counters for compilations, as they tend to use a good amount of CPU.
Index rebuilds are also a CPU-intensive process, but we should weigh whether they are really the main cause here, because they are indeed needed on high-transaction OLTP systems to reduce fragmentation. Note: The information presented here is from my experience, and I do not claim it to be completely accurate. I suggest reading Books Online for further clarification. All of the discussion of wait stats in this blog is generic and varies from system to system. It is recommended that you test this on a development server before implementing it on a production server. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, SQL Wait Stats, SQL Wait Types, T SQL, Technology
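
    A companion check often used alongside runnable_tasks_count (a sketch, not part of the original post) is the share of signal wait time in sys.dm_os_wait_stats; a consistently high percentage suggests tasks are waiting for CPU rather than for resources:

      -- Sketch: signal waits as a percentage of all waits since the last restart or clear.
      SELECT
          CAST(100.0 * SUM(signal_wait_time_ms) / SUM(wait_time_ms) AS decimal(5, 2))
              AS signal_wait_percent
      FROM sys.dm_os_wait_stats
      WHERE wait_time_ms > 0;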

    Read the article

  • SQL SERVER – Error: Fix – Msg 208 – Invalid object name ‘dbo.backupset’ – Invalid object name ‘dbo.backupfile’

    - by pinaldave
    Just a day ago I got a very interesting email. Here it is (modified a bit to make it relevant to this blog post): "Pinal, We are facing a very strange issue. One of our queries related to backup files and backup sets has suddenly stopped working in SSMS. It works fine in the application and in the stored procedure, but when we run it in SSMS it gives the following error. Msg 208, Level 16, State 1, Line 1 Invalid object name 'dbo.backupfile'. Here are the queries which we are trying to execute. SELECT name, database_name, backup_size, TYPE, compatibility_level, backup_set_id FROM dbo.backupset; SELECT logical_name, backup_size, file_type FROM dbo.backupfile; These queries give us details about the backup sets and backup files from when the backup was taken." When I receive this kind of email, I usually have no immediate answer. The claim that it works in the stored procedure and in the application but not in SSMS gives me no real data. I first asked him to check two things: Is he connected to the correct server? His answer was yes. Does he have enough permissions? His answer was that he was logged in as an admin. This meant there was something more to it, and I asked him to send me a screenshot of his SSMS. He promptly sent it to me, and as soon as I received the screenshot I knew what was going on. Before I say anything, take a look at the screenshot yourself and see if you can figure out why his queries are not working in SSMS. Just to make your life a bit easier, I have already given a hint in the image. The answer is very simple: the database context is the master database. To execute the above two queries, the database context has to be msdb. The tables backupset and backupfile belong to the msdb database only. Here are two workarounds or solutions to the above problem: 1) Change context to MSDB. When the above two queries are run as follows, they will not error out and will give the desired result. USE msdb GO SELECT name, database_name, backup_size, TYPE, compatibility_level, backup_set_id FROM dbo.backupset; SELECT logical_name, backup_size, file_type FROM dbo.backupfile; 2) Prefix the query with msdb. There are cases where the above script is used in a stored procedure or as part of a bigger query, and it is not possible to change the context of the whole query to a specific database. In that case, use the three-part naming convention and prefix the tables with msdb. SELECT name, database_name, backup_size, TYPE, compatibility_level, backup_set_id FROM msdb.dbo.backupset; SELECT logical_name, backup_size, file_type FROM msdb.dbo.backupfile; A very simple solution, but one that sometimes keeps people wondering for a while. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Error Messages, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • SQL SERVER – Data Pages in Buffer Pool – Data Stored in Memory Cache

    - by pinaldave
    Have you ever wondered what types of data are in your cache? During SQL Server trainings, I am usually asked if there is any way one can know how much of a table's data is stored in the memory cache. The more detailed question I usually get is: if there are multiple indexes on a table (and they are used in a query), is the data of that single table stored in the memory cache multiple times or only once? Here is a query you can run to figure out what kind of data is stored in the cache. USE AdventureWorks GO SELECT COUNT(*) AS cached_pages_count, name AS BaseTableName, IndexName, IndexTypeDesc FROM sys.dm_os_buffer_descriptors AS bd INNER JOIN ( SELECT s_obj.name, s_obj.index_id, s_obj.allocation_unit_id, s_obj.OBJECT_ID, i.name IndexName, i.type_desc IndexTypeDesc FROM ( SELECT OBJECT_NAME(OBJECT_ID) AS name, index_id ,allocation_unit_id, OBJECT_ID FROM sys.allocation_units AS au INNER JOIN sys.partitions AS p ON au.container_id = p.hobt_id AND (au.type = 1 OR au.type = 3) UNION ALL SELECT OBJECT_NAME(OBJECT_ID) AS name, index_id, allocation_unit_id, OBJECT_ID FROM sys.allocation_units AS au INNER JOIN sys.partitions AS p ON au.container_id = p.partition_id AND au.type = 2 ) AS s_obj LEFT JOIN sys.indexes i ON i.index_id = s_obj.index_id AND i.OBJECT_ID = s_obj.OBJECT_ID ) AS obj ON bd.allocation_unit_id = obj.allocation_unit_id WHERE database_id = DB_ID() GROUP BY name, index_id, IndexName, IndexTypeDesc ORDER BY cached_pages_count DESC; GO Now let us run the query above and observe its output. We can see that there are four columns in the result. Cached_Pages_Count lists the pages cached in memory. BaseTableName lists the original base table from which the data pages are cached. IndexName lists the name of the index from which the pages are cached. IndexTypeDesc lists the type of index. Now, let us do one more experiment here. Please note that you should not run this test on a production server, as it can severely reduce the performance of the database. DBCC DROPCLEANBUFFERS This will drop all the clean buffers so we will be able to start again from there. Now run the following script and check the execution plan of the query. USE AdventureWorks GO SELECT UnitPrice, ModifiedDate FROM Sales.SalesOrderDetail WHERE SalesOrderDetailID BETWEEN 1 AND 100 GO The execution plan shows the usage of two different indexes. Now, let us run the script that checks the pages cached in SQL Server. It gives us the following output. It is clear from the resultset that when more than one index is used, data pages related to both (or all) of the indexes are stored in the memory cache separately. Let me know what you think of this article. I had great pleasure writing it because I was able to write about the subject I like the most. In the next article, we will see exactly which data are cached and which are not, using a few undocumented commands. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: DMV, Pinal Dave, SQL, SQL Authority, SQL Optimization, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: SQL DMV
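
    A related sketch (not part of the article's script): the same sys.dm_os_buffer_descriptors DMV can be grouped by database to show how the buffer pool is split across databases:

      -- Sketch: buffer pool usage per database; each data page is 8 KB.
      SELECT
          CASE WHEN database_id = 32767 THEN 'ResourceDb'
               ELSE DB_NAME(database_id) END AS database_name,
          COUNT(*)            AS cached_pages_count,
          COUNT(*) * 8 / 1024 AS cached_size_mb
      FROM sys.dm_os_buffer_descriptors
      GROUP BY database_id
      ORDER BY cached_pages_count DESC;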

    Read the article

  • SQL SERVER – Maximize Database Performance with DB Optimizer – SQL in Sixty Seconds #054

    - by Pinal Dave
    Performance tuning is an interesting concept and everybody evaluates it differently. Every developer and DBA has a different opinion about how to do performance tuning. I personally believe performance tuning is a three-step process: understanding the query, identifying the bottleneck, and implementing the fix. When we are working with a large database application and it suddenly starts to slow down, we are all under stress about how to get the database back to normal speed. Most of the time we do not have enough time to do a deep analysis of what is going wrong, let alone of what will fix the problem. Our primary goal at that moment is just to fix the database problem as fast as we can. However, one very important thing to keep in mind is that a quick fix should not create any further issues with other parts of the system. When time is of the essence and we still want a deep analysis of our system to give us the best solution, we often tend to make mistakes, sometimes simply because we do not have the time to analyze the entire system properly. Here is what I do when I face such a situation: I take the help of DB Optimizer. It is a fantastic tool that does superlative performance tuning of the system. Every time I talk about a performance tuning tool, the initial reaction of people is that they do not want to try it, as they believe it requires a lot of learning before they can use it. That is absolutely not true in the case of DB Optimizer. It is a very easy-to-use and intuitive tool, and one can get going with the product in no time. Here is a quick video I have built where I demonstrate how we can identify which index is missing for a query and how we can quickly create that index. All three steps of query tuning are completed in less than 60 seconds. If you are into performance tuning and query optimization, you should download DB Optimizer and give it a go. Let us see the same concept in the following SQL in Sixty Seconds video: You can download DB Optimizer and reproduce the same Sixty Seconds experience. Related Tips in SQL in Sixty Seconds: Performance Tuning – Part 1 of 2 – Getting Started and Configuration Performance Tuning – Part 2 of 2 – Analysis, Detection, Tuning and Optimizing What would you like to see in the next SQL in Sixty Seconds video? Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Database, Pinal Dave, PostADay, SQL, SQL Authority, SQL in Sixty Seconds, SQL Interview Questions and Answers, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology, Video Tagged: Identity
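
    The video relies on DB Optimizer for the "which index is missing" step; purely as a manual alternative (a sketch, not the tool's output), SQL Server's missing-index DMVs surface a similar signal:

      -- Sketch: index suggestions recorded by the optimizer since the last restart.
      SELECT TOP (10)
          mid.statement AS table_name,
          mid.equality_columns,
          mid.inequality_columns,
          mid.included_columns,
          migs.user_seeks,
          migs.avg_user_impact
      FROM sys.dm_db_missing_index_details AS mid
      INNER JOIN sys.dm_db_missing_index_groups AS mig
          ON mig.index_handle = mid.index_handle
      INNER JOIN sys.dm_db_missing_index_group_stats AS migs
          ON migs.group_handle = mig.index_group_handle
      ORDER BY migs.user_seeks * migs.avg_user_impact DESC;
      -- Treat these purely as suggestions; review them before creating any index.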

    Read the article

  • SQL SERVER – Introduction to CUME_DIST – Analytic Functions Introduced in SQL Server 2012

    - by pinaldave
    This blog post is written in response to the T-SQL Tuesday post of Prox 'n' Funx. This is a very interesting subject. By the way, Brad Schulz is my favorite guy when it comes to blogging; I respect him and learn a lot from him. With everybody writing something new on this subject, I decided to start a SQL Server 2012 analytic functions series. SQL Server 2012 introduces the new analytic function CUME_DIST(). This function provides the cumulative distribution value. It would be very difficult to explain this in words, so I will use a small example to explain the function. Instead of creating a new table, I will be using the AdventureWorks sample database, as most developers use that for experiments. Let us run the following query. USE AdventureWorks GO SELECT SalesOrderID, OrderQty, CUME_DIST() OVER(ORDER BY SalesOrderID) AS CDist FROM Sales.SalesOrderDetail WHERE SalesOrderID IN (43670, 43669, 43667, 43663) ORDER BY CDist DESC GO The above query will give us the following result. Now let us understand the formula behind CUME_DIST and why the values for SalesOrderID = 43670 are 1. Let us take another example and be clear about why the values for SalesOrderID = 43667 are 0.5. Now let us enhance the same example, use PARTITION BY in the OVER clause, and see the results. Run the following query in SQL Server 2012. USE AdventureWorks GO SELECT SalesOrderID, OrderQty, ProductID, CUME_DIST() OVER(PARTITION BY SalesOrderID ORDER BY ProductID ) AS CDist FROM Sales.SalesOrderDetail s WHERE SalesOrderID IN (43670, 43669, 43667, 43663) ORDER BY s.SalesOrderID DESC, CDist DESC GO Now let us see the result of this query. We have changed the ORDER BY clause and partitioned by SalesOrderID. You can see that the CUME_DIST() function now gives us different results, and we now see the value 1 multiple times: as we are partitioning, we get a CUME_DIST() value for each group of SalesOrderID. CUME_DIST() was a long-awaited analytic function and I am glad to see it in SQL Server 2012. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Function, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology
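
    For reference, the formula is: CUME_DIST() = (number of rows with a value less than or equal to the current row's value) / (total number of rows in the partition). A small sketch, assuming the same AdventureWorks data, that computes the value by hand next to the built-in function:

      -- Sketch: CDistByHand should match CDist. The RANGE ... CURRENT ROW frame includes
      -- ties, which is exactly the "values less than or equal to the current value" count.
      SELECT SalesOrderID, OrderQty,
             CUME_DIST() OVER (ORDER BY SalesOrderID) AS CDist,
             COUNT(*) OVER (ORDER BY SalesOrderID
                            RANGE BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW) * 1.0
                 / COUNT(*) OVER () AS CDistByHand
      FROM Sales.SalesOrderDetail
      WHERE SalesOrderID IN (43670, 43669, 43667, 43663)
      ORDER BY CDist DESC;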

    Read the article

  • SQL SERVER – Validating Spatial Object with IsValidDetailed Function

    - by pinaldave
    Which do you prefer: an error, or a warning indicating that an error may happen along with the reason for it? While writing the previous statement I remembered the movie "Minority Report". This blog post is not about Minority Report, but I will still cover its concept in a single statement: "Let us predict the future and prevent the crime which is about to happen." (Please feel free to correct me if I am wrong about the movie's concept; I really do not want to hurt your sentiments if you are a dedicated fan.) Let us switch to the SQL Server world. Spatial data types are an interesting concept. I love writing about spatial data types because they allow me to be creative with shapes (just like a toddler). When working with spatial data types, everything is fine as long as the spatial object is valid. However, when the spatial object has an issue or is created with invalid coordinates, it used to give a simple error saying that there is an issue with the object, without providing much information. This made it very difficult to debug. If the spatial object was used in a big procedure and that procedure errored out because of the invalid spatial object, it was indeed very difficult to debug. I always wished that more information were provided about what exactly the problem with the spatial data type is. SQL Server 2012 introduces the new function IsValidDetailed(), and it has made my life very easy. In simple words, this function checks whether the spatial object passed to it is valid or not. If it is valid, it says so; if the spatial object is not valid, it returns the answer that it is not valid along with the reason. This makes it very easy to debug the issue and make the necessary correction. DECLARE @p GEOMETRY = 'Polygon((2 2, 6 6, 4 2, 2 2))' SELECT @p.IsValidDetailed() GO DECLARE @p GEOMETRY = 'Polygon((2 2, 3 3, 4 4, 5 5, 6 6, 2 2))' SELECT @p.IsValidDetailed() GO DECLARE @p GEOMETRY = 'Polygon((2 2, 4 4, 4 2, 2 3, 2 2))' SELECT @p.IsValidDetailed() GO DECLARE @p GEOMETRY = 'CIRCULARSTRING(2 2, 4 4, 0 0)' SELECT @p.IsValidDetailed() GO DECLARE @p GEOMETRY = 'CIRCULARSTRING(2 2, 4 4, 0 0)' SELECT @p.IsValidDetailed() GO DECLARE @p GEOMETRY = 'LINESTRING(2 2, 4 4, 0 0)' SELECT @p.IsValidDetailed() GO Here is the resultset of the above queries. You can see some valid objects and some invalid ones; if an object is invalid, the result also gives the reason along with the error message. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: Spatial Database, SQL Spatial
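
    As a follow-up sketch (not from the post itself): once IsValidDetailed() has reported why an instance is invalid, the MakeValid() method can often repair it, for example the self-intersecting polygon from the script above:

      -- Sketch: diagnose, repair, and re-check one of the invalid polygons.
      DECLARE @p GEOMETRY = 'Polygon((2 2, 4 4, 4 2, 2 3, 2 2))';
      SELECT @p.IsValidDetailed()       AS Diagnosis,
             @p.MakeValid().STIsValid() AS IsValidAfterRepair,
             @p.MakeValid().ToString()  AS RepairedShape;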

    Read the article
