Search Results

Search found 48586 results on 1944 pages for 'page performance'.


  • Theming the default search results page in Drupal

    - by David Pratt
    I'm trying to customise the mark-up of the default search results page in Drupal 6. Specifically, I'd like to remove the search box and the title from the page - I know I can hide them with CSS, but I'd rather they weren't rendered in the first place. Ideally, in the same way that you theme a particular content type's node by copying node.tpl.php, renaming it to something like node-blog.tpl.php and then amending the mark-up accordingly - is there an equivalent way to do this for the search results page?
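    A minimal sketch of one possible approach, assuming Drupal 6 core (which ships search-results.tpl.php and search-result.tpl.php in modules/search); "mymodule" and "mytheme" are made-up names, and the exact behaviour of #access on the whole form is worth verifying:

    ```php
    <?php
    // 1. Copy modules/search/search-results.tpl.php and search-result.tpl.php
    //    into your theme directory and adjust the mark-up there, the same way
    //    you would with node-blog.tpl.php.

    // 2. In a small custom module, keep the search box from being rendered at
    //    all on the results page (instead of hiding it with CSS).
    function mymodule_form_alter(&$form, &$form_state, $form_id) {
      if ($form_id == 'search_form' && arg(0) == 'search') {
        $form['#access'] = FALSE;
      }
    }

    // 3. In your theme's template.php, blank the page title on search pages.
    function mytheme_preprocess_page(&$vars) {
      if (arg(0) == 'search') {
        $vars['title'] = '';
      }
    }
    ```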

    Read the article

  • DataView Vs DataTable.Select()

    - by Aseem Gautam
    Considering the code below: DataView someView = new DataView(someTable); someView.RowFilter = someFilter; if (someView.Count > 0) { ... } Quite a number of articles say DataTable.Select() is better than using DataViews, but these are prior to VS2008: "Solved: The Mystery of DataView's Poor Performance with Large Recordsets" and "Array of DataRecord vs. DataView: A Dramatic Difference in Performance". So in a situation where I just want a subset of DataRows based on some filter criteria (a single query), which is better: DataView or DataTable.Select()?
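    A side-by-side sketch of the two options being compared ("someTable" and "someFilter" stand in for the question's own objects). The usual distinction is that a DataView builds and maintains an index for its filter, which pays off when the view is reused or data-bound, while Select() just returns a snapshot array - but which wins for a one-off query is something to measure:

    ```csharp
    using System.Data;

    // Option 1: DataTable.Select() - returns a snapshot array of matching rows.
    DataRow[] matches = someTable.Select(someFilter);
    if (matches.Length > 0)
    {
        // work with matches[i] ...
    }

    // Option 2: DataView - keeps a filtered, index-backed view over the table.
    DataView someView = new DataView(someTable) { RowFilter = someFilter };
    if (someView.Count > 0)
    {
        // work with someView[i].Row ...
    }
    ```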

    Read the article

  • Should we denormalize database to improve performance?

    - by Groo
    We have a requirement to store 500 measurements per second, coming from several devices. Each measurement consists of a timestamp, a quantity type, and several vector values. Right now there are 8 vector values per measurement, and we may consider this number to be constant for the needs of our prototype project. We are using NHibernate. Tests are done in SQLite (disk file db, not in-memory), but production will probably be MsSQL. Our Measurement entity class is the one that holds a single measurement, and looks like this: public class Measurement { public virtual Guid Id { get; private set; } public virtual Device Device { get; private set; } public virtual Timestamp Timestamp { get; private set; } public virtual IList<VectorValue> Vectors { get; private set; } } Vector values are stored in a separate table, so that each of them references its parent measurement through a foreign key. We have done a couple of things to ensure that the generated SQL is (reasonably) efficient: we are using Guid.Comb for generating IDs, we are flushing around 500 items in a single transaction, and the ADO.Net batch size is set to 100 (I think SQLite does not support batch updates? But it might be useful later). The problem: right now we can insert 150-200 measurements per second (which is not fast enough, although this is SQLite we are talking about). Looking at the generated SQL, we can see that in a single transaction we insert (as expected) 1 timestamp, 1 measurement, and 8 vector values, which means that we are actually doing 10x more single-table inserts: 1500-2000 per second. If we placed everything (all 8 vector values and the timestamp) into the measurement table (adding 9 dedicated columns), it seems that we could increase our insert speed up to 10 times. Switching to SQL Server will improve performance, but we would like to know if there might be a way to avoid unnecessary performance costs related to the way the database is organized right now. [Edit] With in-memory SQLite I get around 350 items/sec (3500 single-table inserts), which I believe is about as good as it gets with NHibernate (taking this post for reference: http://ayende.com/Blog/archive/2009/08/22/nhibernate-perf-tricks.aspx). But I might as well switch to SQL Server and stop assuming things, right? I will update my post as soon as I test it.
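    For reference, a hypothetical T-SQL layout of the "9 dedicated columns" idea being weighed (all column names and types here are assumptions, not the project's actual schema): one row per measurement, with the timestamp and the 8 vector values inlined instead of stored as child rows.

    ```sql
    -- Hypothetical denormalized layout: one insert per measurement instead of ten.
    CREATE TABLE Measurement (
        Id       UNIQUEIDENTIFIER NOT NULL PRIMARY KEY,
        DeviceId UNIQUEIDENTIFIER NOT NULL,
        TakenAt  DATETIME         NOT NULL,
        V1 FLOAT NOT NULL, V2 FLOAT NOT NULL, V3 FLOAT NOT NULL, V4 FLOAT NOT NULL,
        V5 FLOAT NOT NULL, V6 FLOAT NOT NULL, V7 FLOAT NOT NULL, V8 FLOAT NOT NULL
    );
    ```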

    Read the article

  • Login Page using ASP.Net & Ajax

    - by user1293474
    I'm trying to make a login page using HTML, AJAX & ASP.NET. The data is correctly passed to the AJAX function, but when I debug the ASP.NET page the username and password arrive as NULL. The code is supposed to take the username & password and then return the user id. Html page: <div id="usernameid">Username:</div><input id="username" type="text"/> <span id="username_status"></span> <div id="passwordid">Password:</div><input id="password" type="password"/> <span id="password_status"></span> <div> <input id="loginbutton" onclick="UserLogin()" type="submit" value="Submit" /></div> Javascript: function UserLogin() { var postData = JSON.stringify({ "username": JSON.stringify($("#username").val()), "password": JSON.stringify($("#password").val()) }); alert(postData); $.ajax({ type: "GET", url: "http://localhost:49317/LoginPageForLearn.aspx", data: postData, contentType: "application/json; charset=utf-8", dataType: "jsonp", jsonp: 'jsoncallback', success: callbackfunction, error: function (msg) { alert(msg); } }); } Asp.net page: protected void Page_Load(object sender, EventArgs e) { string userName = ""; int userId = -1; string PassWord = ""; if (Request.QueryString.Count != 0 && Request.QueryString["username"] != string.Empty && Request.QueryString["password"] != string.Empty) { userName = Request.QueryString["username"]; PassWord = Request.QueryString["password"]; userId = GetUserID(userName, PassWord); } } Do you have any ideas why the data isn't passed correctly? Or do you have any other ideas on how I can make a login page using HTML and access the data in SQL? Thanks a lot.
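    One likely cause is that the data is sent as a single JSON string rather than as name/value pairs, so Request.QueryString["username"] never sees a "username" key. A sketch of the client side assuming the page can be called same-origin with a plain GET (the jsonp/callback parts are dropped here for simplicity); passing a plain object lets jQuery serialize it as ?username=...&password=..., which is what the Page_Load code reads:

    ```javascript
    function UserLogin() {
        $.ajax({
            type: "GET",
            url: "http://localhost:49317/LoginPageForLearn.aspx",
            data: {
                username: $("#username").val(),
                password: $("#password").val()
            },
            success: function (result) {
                // handle the returned user id here
            },
            error: function (xhr, status, err) {
                alert(status);
            }
        });
    }
    ```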

    Read the article

  • Java - Collections.sort() performance

    - by msr
    Hello, I'm using Collections.sort() to sort a LinkedList whose elements implement the Comparable interface, so they are sorted in their natural order. The javadoc documentation says this method uses the mergesort algorithm, which has n*log(n) performance. My question is whether there is a more efficient algorithm for sorting my LinkedList? The size of that list could be very large and sorting will also be very frequent. Thanks!
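    Collections.sort() already copies the list into an array, merge-sorts it, and writes it back, so the list type does not change the O(n*log n) bound, which is the best a comparison sort can do. If the list only changes by a few elements between sorts, one alternative (a sketch under that assumption, not a drop-in replacement) is to keep an ArrayList sorted by inserting each new element at its binary-search position instead of re-sorting:

    ```java
    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;

    public class SortedInsertExample {

        // O(log n) search + O(n) shift per insert, versus O(n log n) per full re-sort.
        public static <T extends Comparable<? super T>> void insertSorted(List<T> list, T item) {
            int pos = Collections.binarySearch(list, item);
            if (pos < 0) {
                pos = -(pos + 1);   // binarySearch returns (-(insertion point) - 1) when absent
            }
            list.add(pos, item);
        }

        public static void main(String[] args) {
            List<Integer> values = new ArrayList<Integer>();
            for (int v : new int[] {5, 1, 4, 2, 3}) {
                insertSorted(values, v);
            }
            System.out.println(values); // [1, 2, 3, 4, 5]
        }
    }
    ```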

    Read the article

  • Page transitions using JavaScript

    - by hasan
    Hey, I saw this on a site a couple of days ago and I can't seem to find it again. In any case, this is what was on the site: the page opened regularly when you entered the URL. Upon clicking one of the links on the page, it "transitioned" to the next page (there was a color change), and the URL in the address bar was changed to reflect that. E.g.: if the background was blue on site.com, when clicking on the about link the background would change to green and the browser would show site.com/about, and so on. Also, if the URL entered was site.com/about, the background would be green, and on clicking the home link the site would transition from green to blue and the browser would show site.com. I'm interested in finding out how this was done. Searching on Google got me the meta-refresh tag, but the effect was much more complex and worked on all browsers. Is there any other method out there?
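    The usual way to get this effect is to swap the page content/styles in JavaScript (often loaded via AJAX), animate the change, and update the address bar. Older sites did the URL part with #fragment tricks; a sketch with the HTML5 History API (element ids, colours and URLs below are made up), assuming a browser that supports history.pushState:

    ```javascript
    function goTo(path, backgroundColor) {
        // 1. Animate the visual change (here just the background colour).
        document.body.style.transition = "background-color 0.5s";
        document.body.style.backgroundColor = backgroundColor;

        // 2. Update the address bar without a full reload.
        history.pushState({ color: backgroundColor }, "", path);
    }

    // Make the back/forward buttons restore the right state.
    window.onpopstate = function (event) {
        if (event.state) {
            document.body.style.backgroundColor = event.state.color;
        }
    };

    // Usage: <a href="/about" onclick="goTo('/about', 'green'); return false;">About</a>
    ```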

    Read the article

  • Web page database query optimization

    - by morpheous
    I am putting together a web page which is quite 'expensive' in terms of database hits. I don't want to start optimizing at this stage - though with me trying to hit a deadline, I may end up not optimizing at all. Currently the page requires 18 (that's right, eighteen) hits to the db. I am already using joins, and some of the queries are UNIONed to minimize the trips to the db. My local dev machine can handle this (the page is not slow), however I feel that if I release this into the wild, the number of queries will quickly overwhelm my database (MySQL). I could always use memcache or something similar, but I would much rather continue with my other dev work that needs to be completed before the deadline - at least retrieving the page works - it's simply a matter of optimization now (if required). My question therefore is: is 18 db queries for a single page retrieval completely outrageous (i.e. should I put everything on hold and optimize the hell out of the retrieval logic), or shall I continue as normal, meet the deadline, release on schedule and see what happens? [Edit] Just to clarify, I have already done the 'obvious' things like using (single and composite) indexes for the fields used in the queries. What I haven't yet done is run a query analyzer to see if my indexes etc. are optimal.
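    For the query-analyzer follow-up in the [Edit], MySQL's EXPLAIN is the usual first check that the existing indexes are actually being used. A sketch with a completely hypothetical query (table and column names are made up, not the page's real SQL):

    ```sql
    -- Prefix any of the page's SELECTs with EXPLAIN and look at the "key" and
    -- "rows" columns: which index (if any) is chosen, and how many rows MySQL
    -- expects to examine.
    EXPLAIN
    SELECT o.id, o.created_at, c.name
    FROM orders o
    JOIN customers c ON c.id = o.customer_id
    WHERE o.status = 'open';
    ```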

    Read the article

  • Caching Mysql database for better performance

    - by kobey
    Hi, I'm using the Amazon cloud and I have a performance issue since the HDD is not located on my machine. My database is small (~500MB) and I can afford to keep it all in my RAM. I do not want to keep only query results in my RAM; I need all the tables there. How can I do it? Thanks, Koby P.S. I'm using Ubuntu server...
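    With a ~500MB database, the usual approach is not to literally move tables into RAM (the MEMORY engine loses data on restart) but to give the storage engine a cache larger than the whole dataset, so that after warm-up everything is served from memory. A sketch of the relevant my.cnf settings; the sizes are assumptions to adjust to the instance's RAM:

    ```ini
    # /etc/mysql/my.cnf - hypothetical values for a ~500 MB database.
    [mysqld]
    # InnoDB: buffer pool larger than the whole dataset.
    innodb_buffer_pool_size = 1G

    # MyISAM tables (if any): index cache.
    key_buffer_size = 256M
    ```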

    Read the article

  • Django: How do I position a page when using Django templates

    - by swisstony
    I have a web page where the user enters some data and then clicks a submit button. I process the data and then use the same Django template to display the original data, the submit button, and the results. When I am using the Django template to display results, I would like the page to be automatically scrolled down to the part of the page where the results begin. This allows the user to scroll back up the page if she wants to change her original data and click submit again. Hopefully, there's some simple way of doing this that I can't see at the moment.
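    One simple way is to render an anchor just above the results and jump to it when results are present. A sketch of the template side only; "form" and "results" are made-up context variable names, not the project's actual ones:

    ```html
    <form method="post" action="#results">
      {{ form }}
      <input type="submit" value="Submit">
    </form>

    <a id="results"></a>
    {% if results %}
      <div>
        ... render the results here ...
      </div>
      <script>
        // Fallback: jump to the results section once the page has rendered.
        document.getElementById("results").scrollIntoView();
      </script>
    {% endif %}
    ```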

    Read the article

  • I am totally unable to add a fileTree (JQuery fileTree addon) to my asp.net page

    - by Gadgetsan
    Okay, so I have an ASP.NET (C#) application and I want to add a list of files and folders on the page, so I figured I should use jQuery fileTree (http://abeautifulsite.net/2008/03/jquery-file-tree/#download), but now I am totally unable to display the file list. I initialise the page this way: Site.Master: <link rel="stylesheet" type="text/css" href="../../Content/superfish.css" media="screen"> <link href="../../Content/jqueryFileTree.css" rel="stylesheet" type="text/css" /> <script src="../../Scripts/jquery-1.4.1.min.js" type="text/javascript"></script> <script src="../../Scripts/jquery.easing.1.3.js" type="text/javascript"></script> <script src="../../Scripts/jqueryFileTree.js" type="text/javascript"></script> <script src="../../Scripts/JqueryUI/js/jquery-ui-1.8.1.custom.min.js" type="text/javascript"></script> <script type="text/javascript" src="../../Scripts/jquery.dataTables.js"></script> <script type="text/javascript" src="../../Scripts/superfish.js"></script> <script type="text/javascript"> $(document).ready(function() { test = $('#fileTree').fileTree({script: "jqueryFileTree.aspx" }, function(file) { openFile(file); }); $("button").button(); oTable = $('#data').dataTable({ "bJQueryUI": true, "sPaginationType": "full_numbers", "bSort": true }); }); </script> and in the page, I put my div this way: <asp:Content ID="Content2" ContentPlaceHolderID="MainContent" runat="server"> Documents but I'm positive that jqueryFileTree.aspx is never "called", because if I return this page in my controller, it shows the list of files/folders correctly, so it's also not a problem with my aspx connector... Also, I checked the JS console: it gives no errors, and there is nothing more in the page source code. I've been trying to solve this all day without success, so your help is appreciated.
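    Without the missing div mark-up it's hard to say exactly what's wrong, but a minimal setup based on the plugin's documented root/script options looks roughly like the sketch below (the ids, the absolute connector path and the openFile callback are assumptions). Two things worth ruling out: the #fileTree element must exist when $(document).ready runs, and the relative "jqueryFileTree.aspx" URL resolves against the current page's URL, so an absolute path can rule out a ../../-style mismatch:

    ```html
    <div id="fileTree" class="demo"></div>

    <script type="text/javascript">
        $(document).ready(function () {
            $('#fileTree').fileTree({
                root: '/',                          // folder the connector starts from
                script: '/jqueryFileTree.aspx'      // server-side connector returning the <ul> markup
            }, function (file) {
                openFile(file);
            });
        });
    </script>
    ```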

    Read the article

  • Index on column with only 2 distinct values

    - by Will
    I am wondering about the performance of this index: I have an "Invalid" varchar(1) column that has 2 distinct values: NULL or 'Y'. I have an index on (invalid), as well as on (invalid, last_validated); last_validated is a datetime (this is used for an unrelated SELECT query). I am flagging a small amount (1-5%) of the rows in the table with this as 'to be deleted'. This is so that when I DELETE FROM items WHERE invalid='Y' it does not perform a full table scan for the invalid items. A problem seems to be that the actual DELETE is quite slow now, possibly because the index entries are being removed as the rows are deleted. Would a bitmap index provide better performance for this? Or perhaps no index at all?
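    The "bitmap index" wording suggests Oracle; a sketch of swapping the B-tree index for a bitmap one (index names are made up, the table/column come from the question). Bitmap indexes suit low-cardinality columns, but they are expensive to maintain under concurrent DML, so this is a trade-off to measure rather than a guaranteed win:

    ```sql
    DROP INDEX items_invalid_idx;
    CREATE BITMAP INDEX items_invalid_bix ON items (invalid);
    ```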

    Read the article

  • Javascript function, on web page close

    - by AXheladini
    Hello there, I have a problem that I cannot solve. The problem is this: when I close my page in some browser, I want a message box to appear and ask me if I really want to close the page or not. Logically: I click on the x (close tab in the web browser) and then the box appears with two buttons, yes and no; if I click yes the page will be closed, if I click no the page will not close. I know there must be some JavaScript or AJAX code for this, but I cannot figure it out by myself.
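    The standard hook for this is window.onbeforeunload. Note that the browser shows its own generic confirm dialog; in most browsers you can only supply the message text, not custom yes/no buttons. A minimal sketch:

    ```javascript
    window.onbeforeunload = function (e) {
        var message = "Do you really want to leave this page?";
        e = e || window.event;
        if (e) {
            e.returnValue = message;   // IE / Firefox
        }
        return message;                // Safari / Chrome / others
    };
    ```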

    Read the article

  • Performance tuning of a Hibernate+Spring+MySQL project operation that stores images uploaded by user

    - by Umar
    Hi, I am working on a web project that is Spring+Hibernate+MySQL based. I am stuck at a point where I have to store images uploaded by a user into the database. Although I have written some code that works well for now, I believe that things will mess up when the project goes live. Here's my domain class that carries the image bytes: @Entity public class Picture implements java.io.Serializable{ long id; byte[] data; ... // getters and setters } And here's my controller that saves the file on submit: public class PictureUploadFormController extends AbstractBaseFormController{ ... protected ModelAndView onSubmit(HttpServletRequest request, HttpServletResponse response, Object command, BindException errors) throws Exception{ MultipartFile file; // getting MultipartFile from the command object ... // beginning hibernate transaction ... Picture p=new Picture(); p.setData(file.getBytes()); pictureDAO.makePersistent(p); // this method simply calls getSession().saveOrUpdate(p) // committing hibernate transaction ... } ... } Obviously a bad piece of code. Is there any way I could use InputStream or Blob to save the data, instead of first loading all the bytes from the user into memory and then pushing them into the database? I did some research on Hibernate's support for Blob, and found this in the Hibernate in Action book: java.sql.Blob and java.sql.Clob are the most efficient way to handle large objects in Java. Unfortunately, an instance of Blob or Clob is only useable until the JDBC transaction completes. So if your persistent class defines a property of java.sql.Clob or java.sql.Blob (not a good idea anyway), you’ll be restricted in how instances of the class may be used. In particular, you won’t be able to use instances of that class as detached objects. Furthermore, many JDBC drivers don’t feature working support for java.sql.Blob and java.sql.Clob. Therefore, it makes more sense to map large objects using the binary or text mapping type, assuming retrieval of the entire large object into memory isn’t a performance killer. Note you can find up-to-date design patterns and tips for large object usage on the Hibernate website, with tricks for particular platforms. Now, apparently Blob cannot be used, as it is not a good idea anyway; what else could be used to improve the performance? I couldn't find any up-to-date design pattern or any useful information on the Hibernate website. So any help/recommendations from stackoverflowers will be much appreciated. Thanks
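    If mapping java.sql.Blob is ruled out, one alternative (a sketch, not the project's actual code) is to keep the entity metadata in Hibernate but write the bytes with plain JDBC through the session's connection, streaming the MultipartFile's InputStream so the whole upload never sits in a byte[]. Table and column names below are assumptions:

    ```java
    import java.io.InputStream;
    import java.sql.Connection;
    import java.sql.PreparedStatement;

    public class PictureStreamingDao {

        public void savePictureData(Connection connection, long pictureId,
                                    InputStream data, int length) throws Exception {
            PreparedStatement ps = connection.prepareStatement(
                    "UPDATE picture SET data = ? WHERE id = ?");
            try {
                // The driver reads from the stream while writing the column,
                // so the image is never fully buffered in the JVM heap.
                ps.setBinaryStream(1, data, length);
                ps.setLong(2, pictureId);
                ps.executeUpdate();
            } finally {
                ps.close();
            }
        }
    }
    ```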

    Read the article

  • javascript: waiting for an iframe page to load before writing to it (but not from the page that's tr

    - by Bill Dawes
    Apologies if this has been answered elsewhere, but I haven't been able to find it referenced. (Probably because nobody else would want to do such a daft thing, I admit). So, I have a page with three iframes in it. An event on one triggers a JavaScript function which loads new pages into the other two iframes; ['topright'] and ['bottomright']. However, JavaScript in the page that is being loaded into iframe 'topright' then needs to send information to elements in the 'bottomright' iframe. window.frames['bottomright'].document.subform.ID_client = client; etc But this will only work if the page has fully loaded into the bottomright frame. So what would be the most efficient way for that code in the 'topright' iframe to check and ensure that that form element in the bottomright frame is actually available to write to, before it does write to it? Bearing in mind that the page load has NOT been triggered from the topright frame, so I can't simply use an onLoad function. (I know this probably sounds like a hideously tortuous route for getting data from one page to another, but that's another story. The client is always right, etc...:-))
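    Since the topright page didn't trigger the load, a simple approach is to poll until the target form element exists, then write to it. A sketch assuming same-origin frames; "subform" and "ID_client" are taken from the question, everything else is made up:

    ```javascript
    // Poll the bottomright frame until the form element is there, then write to it.
    function writeClientWhenReady(client, attemptsLeft) {
        var doc, form;
        try {
            doc = window.parent.frames['bottomright'].document;
            form = doc.forms['subform'];
        } catch (e) {
            // Frame not ready yet (or not same-origin) - fall through and retry.
        }

        if (form && form.elements['ID_client']) {
            form.elements['ID_client'].value = client;
        } else if (attemptsLeft > 0) {
            setTimeout(function () {
                writeClientWhenReady(client, attemptsLeft - 1);
            }, 200);
        }
    }

    // e.g. writeClientWhenReady(client, 50);  // retry for ~10 seconds
    ```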

    Read the article

  • Why does dojo parsing time depend on css and images availability?

    - by Kniganapolke
    I have been profiling JavaScript on my page that uses Dojo widgets. I don't use explicit parsing - the parser runs on page load. What I noticed is that if I clear the browser cache before refreshing the page, Dojo parsing takes much more time than if all the files are already cached. Note that we build all the required Dojo modules into a layer (a single file), so we don't lazy-load any JS files. I wonder if the Dojo parsing process depends on images and CSS resources; as far as I know it only instantiates widgets and injects DOM nodes. Do you have any ideas why the Dojo parser runs longer (2-3 times longer in my case) when the cache is cleared?

    Read the article

  • Executing JavaScript after loading a page via AJAX - doesn't work

    - by Deukalion
    I'm trying to fetch a page with AJAX, but when that page includes JavaScript code, it doesn't get executed. Why? Simple code in my AJAX page: <script type="text/javascript"> alert("Hello"); </script> ...and it doesn't execute. I'm trying to use the Google Maps API and add markers with AJAX, so whenever I add one I execute an AJAX page that gets the new marker, stores it in a database and should add the marker "dynamically" to the map. But since I can't execute a single JavaScript function this way, what do I do? Are the functions that I've defined on the page beforehand protected or private? ** UPDATED WITH AJAX FUNCTION ** function ajaxExecute(id, link, query) { if (query != null) { query = query.replace("amp;", ""); } if (window.XMLHttpRequest) { // code for IE7+, Firefox, Chrome, Opera, Safari xmlhttp=new XMLHttpRequest(); } else { // code for IE6, IE5 xmlhttp=new ActiveXObject("Microsoft.XMLHTTP"); } xmlhttp.onreadystatechange=function() { if (xmlhttp.readyState==4 && xmlhttp.status==200) { if (id != null) { document.getElementById(id).innerHTML=xmlhttp.responseText; } } } if (query == null) { xmlhttp.open("GET",link,true); } else { if (query.substr(0, 1) != "?") { xmlhttp.open("GET",link+"?"+query,true); } else { xmlhttp.open("GET",link+query,true); } } xmlhttp.send(); }
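    The functions defined on the page beforehand are still ordinary globals and remain callable; the issue is that scripts inserted via innerHTML are simply not executed by the browser. A common workaround (a sketch, not a drop-in patch for ajaxExecute) is to insert the HTML, then find the <script> elements in it and run them explicitly:

    ```javascript
    // After inserting the AJAX response with innerHTML, execute any scripts in it.
    function insertAndRunScripts(id, html) {
        var container = document.getElementById(id);
        container.innerHTML = html;

        var scripts = container.getElementsByTagName("script");
        for (var i = 0; i < scripts.length; i++) {
            if (scripts[i].src) {
                // External script: re-create the tag so the browser fetches it.
                var s = document.createElement("script");
                s.src = scripts[i].src;
                document.body.appendChild(s);
            } else {
                // Inline script: evaluate it in the global scope.
                window.eval(scripts[i].text || scripts[i].innerHTML);
            }
        }
    }
    ```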

    Read the article

  • Are bit operations quick?

    - by flashnik
    I'm dealing with a problem which needs to work with a lot of data. Currently its values are represented as unsigned int. I know that the real values do not exceed some limit, say 1000. That means I can use unsigned short to store them. One benefit is that it'll use less space. Do I have to pay for it by losing performance? Another scenario: I decide to store the data as short, but all calling functions use int, so I need to convert between these datatypes when storing/extracting values. Will the performance loss be dramatic? Third scenario: due to a great wish to economize memory, I decide to use not short but just 10 bits packed into an array of unsigned int. What will happen in this case compared with the previous ones?
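    A sketch of the third option in C, assuming values fit in 0..1023 and a 32-bit unsigned int. Storing three 10-bit values per word (wasting 2 bits) keeps the code simple because no value straddles a word boundary; the extra shifts and masks are cheap, but whether the conversions matter for this workload is something to measure rather than assume:

    ```c
    #include <stddef.h>
    #include <stdio.h>

    #define VALUES_PER_WORD 3
    #define VALUE_BITS      10
    #define VALUE_MASK      0x3FFu

    static void put_value(unsigned int *words, size_t index, unsigned int value)
    {
        size_t word = index / VALUES_PER_WORD;
        unsigned int shift = (unsigned int)(index % VALUES_PER_WORD) * VALUE_BITS;
        words[word] = (words[word] & ~(VALUE_MASK << shift)) | ((value & VALUE_MASK) << shift);
    }

    static unsigned int get_value(const unsigned int *words, size_t index)
    {
        size_t word = index / VALUES_PER_WORD;
        unsigned int shift = (unsigned int)(index % VALUES_PER_WORD) * VALUE_BITS;
        return (words[word] >> shift) & VALUE_MASK;
    }

    int main(void)
    {
        unsigned int storage[4] = {0};             /* room for 12 ten-bit values */
        put_value(storage, 5, 1000u);
        printf("%u\n", get_value(storage, 5));     /* prints 1000 */
        return 0;
    }
    ```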

    Read the article

  • PHP auto refresh page without losing user input

    - by Tony
    I'm working on a PHP collaboration software project. I have a page that shows the latest updates from other users who are adding content to the database, but also has a form input to allow the user to enter text. I am currently using this code to refresh the page automatically every 30 seconds: header('Refresh: 30'); The problem is that the header code refreshes the entire page, and not just what is pulled from the database. Is there any PHP code that will just pull any new data from the database without refreshing the entire page? If someone could point me in the right direction I'd appreciate it.
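    The usual pattern is to drop the Refresh header and poll a small PHP endpoint with JavaScript, updating only the updates area so the user's form input is untouched. A sketch with made-up names ("updates" is the div showing other users' changes, latest_updates.php is a small script that returns just that fragment of HTML), assuming a browser with native XMLHttpRequest:

    ```html
    <div id="updates"></div>

    <script type="text/javascript">
    function refreshUpdates() {
        var xhr = new XMLHttpRequest();
        xhr.onreadystatechange = function () {
            if (xhr.readyState == 4 && xhr.status == 200) {
                document.getElementById("updates").innerHTML = xhr.responseText;
            }
        };
        xhr.open("GET", "latest_updates.php", true);
        xhr.send();
    }

    // Poll every 30 seconds instead of header('Refresh: 30').
    setInterval(refreshUpdates, 30000);
    refreshUpdates();
    </script>
    ```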

    Read the article
