Search Results

Search found 6355 results on 255 pages for 'slow downs'.

Page 92/255

  • Rethinking Oracle Optimizer Statistics for P6 Part 2

    - by Brian Diehl
     In the previous post (Part 1), I tried to draw some key insights about the relationship between P6 and Oracle Optimizer Statistics. The first is that average cardinality has the greatest impact on query optimization and that the particular queries generated by P6 are more likely to use this average during calculations. The second is that these statistics are unlikely to change greatly over the life of the application. Ultimately, our goal is to get the best query optimization possible. Or is it?

     Stability

     No application administrator wants to get the call at 9am that their application users cannot get their work done because everything is running slow. This is a possibility with a regularly scheduled nightly collection of statistics. It may not just be slow performance, but a complete loss of service because one or more queries are optimized poorly. Ideally, this should not be the case. The database optimizer should make better decisions with more up-to-date data. Better statistics may give an incremental performance benefit. However, this benefit must be balanced against the potential cost of system downtime. It is stability that we ultimately desire, not absolute optimal performance. We do want the benefit of more accurate statistics and better query plans, but not at the risk of an unusable system. As a result, I've developed the following methodology for managing database statistics for the P6 database.

     1. No Automatic Re-Gathering - A daily, weekly, or other fixed interval of statistics gathering is unlikely to be beneficial. Quite the opposite: it is more likely to cause problems.

     2. Smart Re-Gathering - The time to collect statistics is when things have changed significantly. For a new installation of P6, this happens more often because the data is growing from a few rows to thousands and more. But for a mature system, the data is not changing significantly from week to week. There are times to collect statistics:
        - New releases of the application
        - Changes in the underlying hardware or software versions (e.g. a new Oracle RDBMS version)
        - When additional user groups are added. The new groups may use the software in significantly different ways.
        - After significant changes in the data. This may be monthly, quarterly, or yearly.

     3. Always Test - If you take away one thing from this post, it is to always have a plan to test after changing statistics. In reality, statistics can be collected as often as you desire, provided there are tests in place to verify that performance is the same or better. These might be automated tests or simply a manual script of application functions.

     4. Have a Way Out - Never change the statistics without a way to return to the previous set. Think of the statistics as one part of the overall application code, alongside the source code--both application and RDBMS. It would be foolish to change to new code without a way to get back to the previous version.

     In the final post, I will talk about the actual script I created for P6 PMDB and a possible future direction for managing query performance.
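     For point 4, Oracle's DBMS_STATS package can save the current statistics to a staging table before anything is re-gathered, giving a one-call way back if the new plans misbehave. A minimal sketch of that idea (the schema name P6ADMIN and the statid label are placeholders, not part of the article):

        BEGIN
          -- One-time: create a staging table to hold saved statistics
          DBMS_STATS.CREATE_STAT_TABLE(ownname => 'P6ADMIN', stattab => 'P6_STAT_BACKUP');

          -- Before re-gathering: save the current statistics under a label
          DBMS_STATS.EXPORT_SCHEMA_STATS(ownname => 'P6ADMIN', stattab => 'P6_STAT_BACKUP', statid => 'PRE_CHANGE');

          -- Re-gather (only at the "smart" times described above)
          DBMS_STATS.GATHER_SCHEMA_STATS(ownname => 'P6ADMIN');
        END;
        /

        -- The way out: restore the saved statistics if testing shows a regression
        BEGIN
          DBMS_STATS.IMPORT_SCHEMA_STATS(ownname => 'P6ADMIN', stattab => 'P6_STAT_BACKUP', statid => 'PRE_CHANGE');
        END;
        /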

    Read the article

  • jQuery dropdown is really jumpy

    - by Matthew Westrik
     Hi, I'm making a dropdown menu with jQuery (some of the code is borrowed from a tutorial by someone, although I forget who...). When I use the dropdown, it slides up and down really fast, and I can't figure it out. What do you think?

     HTML

        <div id="nav">
          <ul class="topnav">
            <li><a class="selected" href="#" title="home">home</a></li>
            <li><a href="events/" title="events calendar">events</a></li>
            <li><a href="photos/" title="photo gallery">photos</a></li>
            <li><a href="staff/" title="faculty">staff</a>
              <ul class="subnav">
                <li><a href="#">Luke</a></li>
                <li><a href="#">Darth Vader</a></li>
                <li><a href="#">Princess Leia</a></li>
                <li><a href="#">Jabba the Hutt</a></li>
              </ul>
            </li>
            <li><a href="contact/" title="contact">contact</a></li>
          </ul>
        </div>

     jQuery

        $(document).ready(function(){
          $("ul.subnav").parent()
            .hover(function() {
              $(this).parent().find("ul.subnav").slideDown('fast').show(); // Drop down the subnav on hover...
              $(this).parent()
                .hover(function() {
                }, function(){
                  $(this).parent().find("ul.subnav").slideUp('slow');
                });
              $(this).parent().find("ul.subnav")
                .hover(function() {
                }, function(){
                  $(this).parent().find("ul.subnav").slideUp('slow');
                });
            }).hover(function() {
              $(this).addClass("subhover");
            }, function(){
              $(this).removeClass("subhover");
            });
        });
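     One likely cause of the jumpiness is that every time the mouse enters the menu item, the handler binds yet more hover handlers and queues more slide animations on the same element. A minimal sketch of an alternative that binds once and clears any queued animation before starting a new one (this is a rewrite, not the tutorial's code; it assumes a jQuery version with the :has() selector):

        $(document).ready(function () {
          $("ul.topnav > li:has(ul.subnav)").hover(
            function () {
              // entering: stop whatever is queued, then slide the subnav down
              $(this).addClass("subhover")
                     .find("ul.subnav").stop(true, true).slideDown('fast');
            },
            function () {
              // leaving: stop again, then slide it back up
              $(this).removeClass("subhover")
                     .find("ul.subnav").stop(true, true).slideUp('slow');
            }
          );
        });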

    Read the article

  • Get ListView Visible items

    - by Vincent Piel
     I have a ListView which might contain a lot of items, so it is virtualized and recycling items. It does not use sorting. I need to refresh some displayed values, but when there are too many items it is too slow to update everything, so I would like to refresh only the visible items. How could I get a list of all currently displayed items? I tried to look into the ListView or in the ScrollViewer, but I still have no idea how to achieve this. The solution must NOT go through all items to test whether they can be seen, because this would be too slow. I'm not sure code or XAML would be useful; it is just a virtualized/recycling ListView with its ItemsSource bound to an array. Edit - Answer: thanks to akjoshi, I found the way: get the ScrollViewer of the ListView (with a FindDescendant method, which you can write yourself with the VisualTreeHelper). Then use its ScrollViewer.VerticalOffset: it is the index of the first item shown, and ScrollViewer.ViewportHeight: it is the count of items shown. Note: CanContentScroll must be true.
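     A minimal C# sketch of the approach described in the edit (FindDescendant is a hand-rolled helper, not a framework method; the offsets are measured in items only because CanContentScroll is true, i.e. the ScrollViewer scrolls by item rather than by pixel):

        using System.Windows;
        using System.Windows.Controls;
        using System.Windows.Media;

        static class ListViewHelper
        {
            // Walk the visual tree to find the first descendant of type T.
            static T FindDescendant<T>(DependencyObject root) where T : DependencyObject
            {
                for (int i = 0; i < VisualTreeHelper.GetChildrenCount(root); i++)
                {
                    DependencyObject child = VisualTreeHelper.GetChild(root, i);
                    T result = child as T ?? FindDescendant<T>(child);
                    if (result != null)
                        return result;
                }
                return null;
            }

            // First visible item index and the number of visible items.
            public static void GetVisibleRange(ListView listView, out int first, out int count)
            {
                ScrollViewer sv = FindDescendant<ScrollViewer>(listView);
                first = (int)sv.VerticalOffset;   // index of the first item shown
                count = (int)sv.ViewportHeight;   // how many items are shown
            }
        }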

    Read the article

  • Fastest way to remove non-numeric characters from a VARCHAR in SQL Server

    - by Dan Herbert
     I'm writing an import utility that is using phone numbers as a unique key within the import. I need to check that the phone number does not already exist in my DB. The problem is that phone numbers in the DB could have things like dashes and parentheses and possibly other things. I wrote a function to remove these things; the problem is that it is slow, and with thousands of records in my DB and thousands of records to import at once, this process can be unacceptably slow. I've already put an index on the phone number column. I tried using the script from this post: http://stackoverflow.com/questions/52315/t-sql-trim-nbsp-and-other-non-alphanumeric-characters but that didn't speed it up any. Is there a faster way to remove non-numeric characters? Something that can perform well when 10,000 to 100,000 records have to be compared. Whatever is done needs to perform fast.

     Update: Given what people responded with, I think I'm going to have to clean the fields before I run the import utility. To answer the question of what I'm writing the import utility in, it is a C# app. I'm comparing BIGINT to BIGINT now, with no need to alter DB data, and I'm still taking a performance hit with a very small set of data (about 2000 records). Could comparing BIGINT to BIGINT be slowing things down? I've optimized the code side of my app as much as I can (removed regexes, removed unnecessary DB calls). Although I can't isolate SQL as the source of the problem anymore, I still feel like it is.
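     For reference, a common T-SQL pattern for stripping non-digits is a PATINDEX/STUFF loop; the cost can then be paid once by storing the cleaned value in a persisted computed column and indexing that, instead of cleaning on every comparison. A sketch under assumed names (dbo.Customers.Phone is a placeholder, not from the question):

        CREATE FUNCTION dbo.DigitsOnly (@s VARCHAR(50))
        RETURNS VARCHAR(50)
        WITH SCHEMABINDING
        AS
        BEGIN
            -- Repeatedly remove the first character that is not 0-9
            WHILE PATINDEX('%[^0-9]%', @s) > 0
                SET @s = STUFF(@s, PATINDEX('%[^0-9]%', @s), 1, '');
            RETURN @s;
        END;
        GO

        -- Clean once, persist, and index, so lookups compare pre-cleaned values
        ALTER TABLE dbo.Customers
            ADD PhoneDigits AS dbo.DigitsOnly(Phone) PERSISTED;

        CREATE INDEX IX_Customers_PhoneDigits ON dbo.Customers (PhoneDigits);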

    Read the article

  • jQuery UI - addClass removeClass - CSS values are stuck

    - by Jason D
     Hi, I'm trying to do a simple animation. You show the div: it animates correctly. You hide the div: correct. You show the div again: it shows, but there is no animation. It is stuck at the value it had when you first interrupted it. So somehow the interpolated CSS that is applied during [add|remove]Class is getting stuck there. The second time around, the [add|remove]Class is actually running, but the CSS it's setting from the class is getting ignored (I think being overshadowed). How can I fix this WITHOUT resorting to .animate and hard-coded style values? The whole point was to put the animation end point in a CSS class. Thanks!

        <!doctype html>
        <style type="text/css">
          div { width: 400px; height: 200px; }
          .green { background-color: green; }
        </style>
        <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js" type="text/javascript"></script>
        <script src="http://ajax.googleapis.com/ajax/libs/jqueryui/1.8/jquery-ui.min.js" type="text/javascript"></script>
        <script type="text/javascript">
          $(function() {
            $('#show').bind({ click: function() { showAndRun() } })
            $('#hide').bind({ click: function() { $('div').stop(true, false).fadeOut('slow') } })

            function showAndRun() {
              function pulse() {
                $('div').removeClass('green', 2000, function() {
                  $(this).addClass('green', 2000, pulse)
                })
              }
              $('div').stop(true, false).hide().addClass('green').fadeIn('slow', pulse)
            }
          })
        </script>
        <input id="show" type="button" value="show" /><input id="hide" type="button" value="hide" />
        <div style="display: none;"></div>
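     jQuery UI animates class changes by interpolating inline styles, so one plausible explanation is that stop(true, false) halts that animation part-way and leaves the interpolated background-color sitting in the element's style attribute, which then overrides the class on the next run. A hedged sketch of one workaround, clearing the leftover inline value before restarting (only showAndRun changes):

        function showAndRun() {
          function pulse() {
            $('div').removeClass('green', 2000, function() {
              $(this).addClass('green', 2000, pulse)
            })
          }
          $('div')
            .stop(true, false)
            .css('background-color', '')   // drop the inline value left over from the interrupted class animation
            .hide()
            .addClass('green')
            .fadeIn('slow', pulse)
        }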

    Read the article

  • Increment variable for

    - by John Doe
     Hello. I have 30 divs with the same class on the same page. When I press the title (.pull) the content slides up (.toggle_container). When the content is hidden and I press the title again, the content slides down. Also, I want to store the div state inside a cookie. I modified my original code to store the div state inside a cookie, but it's not working for all the divs (pull1, toggle_container1, pull2, toggle_container2 [...]); it's working only for the first one (pull0, toggle_container0). What am I doing wrong?

        var increment = 0;

        if ($.cookie('showTop') == 'collapsed') {
          $(".toggle_container" + increment).hide();
        } else {
          $(".toggle_container" + increment).show();
        };

        $("a.pull" + increment).click(function () {
          if ($(".toggle_container" + increment).is(":hidden")) {
            $(".toggle_container" + increment).slideDown("slow");
            $.cookie('showTop', 'expanded');
            increment++;
          } else {
            $(".toggle_container" + increment).slideUp("slow");
            $.cookie('showTop', 'collapsed');
            increment++;
          }
          return false;
        });
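     The setup code above runs once with increment still at 0, so only a.pull0 ever gets a click handler; bumping increment inside the handler doesn't bind anything new, and a single 'showTop' cookie can't remember 30 independent states anyway. A sketch of one fix, looping over all the pairs and keeping one cookie per div (the 'showTop' + n cookie names are an assumption; the count of 30 comes from the question):

        for (var i = 0; i < 30; i++) {
          (function (n) {
            var container = $(".toggle_container" + n);
            var cookieName = 'showTop' + n;          // one cookie per div

            if ($.cookie(cookieName) == 'collapsed') {
              container.hide();
            } else {
              container.show();
            }

            $("a.pull" + n).click(function () {
              if (container.is(":hidden")) {
                container.slideDown("slow");
                $.cookie(cookieName, 'expanded');
              } else {
                container.slideUp("slow");
                $.cookie(cookieName, 'collapsed');
              }
              return false;
            });
          })(i);
        }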

    Read the article

  • Python: speed up removal of every n-th element from list.

    - by ChristopheD
     I'm trying to solve this programming riddle, and although the solution (see code below) works correctly, it is too slow for successful submission. Any pointers as to how to make this run faster (removal of every n-th element from a list)? Or suggestions for a better algorithm to calculate the same? I can't seem to think of anything other than brute force for now... Basically the task at hand is:

     GIVEN: L = [2,3,4,5,6,7,8,9,10,11,........]
     1. Take the first remaining item in list L (in the general case 'n'). Move it to the 'lucky number list'. Then drop every 'n-th' item from the list.
     2. Repeat 1
     TASK: Calculate the n-th number from the 'lucky number list' (1 <= n <= 3000)

     My current code (it calculates the 3000 first lucky numbers in about a second on my machine - but unfortunately too slow):

        """
        SPOJ Problem Set (classical)
        1798. Assistance Required
        URL: http://www.spoj.pl/problems/ASSIST/
        """
        sieve = range(3, 33900, 2)
        luckynumbers = [2]

        while True:
            wanted_n = input()
            if wanted_n == 0:
                break
            while len(luckynumbers) < wanted_n:
                item = sieve[0]
                luckynumbers.append(item)
                items_to_delete = set(sieve[::item])
                sieve = filter(lambda x: x not in items_to_delete, sieve)
            print luckynumbers[wanted_n-1]
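     Most of the time here goes into building a set and rebuilding the whole list through filter() for every lucky number. Deleting the extended slice in place removes exactly the same elements (indices 0, item, 2*item, ...) in a single pass. A sketch of that change, keeping the rest of the structure (Python 2, as in the original):

        sieve = range(3, 33900, 2)
        luckynumbers = [2]

        while True:
            wanted_n = input()
            if wanted_n == 0:
                break
            while len(luckynumbers) < wanted_n:
                item = sieve[0]
                luckynumbers.append(item)
                del sieve[::item]          # drop every item-th element in place
            print luckynumbers[wanted_n - 1]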

    Read the article

  • Advice needed on best and most efficient practices with developing google apps application...

    - by Ali
     Hi guys, I'm getting my feet wet with developing my order management application for integration with Google Apps. However, there are certain aspects I need to take into consideration prior to proceeding any further. My application is such that it would upload documents to Google Documents and store contacts in Google Contacts. It requires that a single order can have a number of uploaded documents associated with it, as well as some contacts associated with it. My question, however, is what would be the most efficient way to implement this. I could keep key tables for both contacts and documents which would contain just an ID and a link to the documents/contacts, or their respective identification IDs on Google. Or I could maintain an exact replica of the information in my own database as well as a link to the contact on Google. However, wouldn't that be too redundant? I don't want my application to be really slow, as I'm afraid that every time I make a call to Google Docs to retrieve a list of documents, or to Google Contacts, it would be really slow on my application - or am I getting worried for no reason? Any advice would be most appreciated.

    Read the article

  • JQuery Simple Modal OSX Multiple dialogs

    - by Aneef
     Hi, I'm planning to use jQuery SimpleModal for login and registration on my project site. I tried to have 2 modals as mentioned here, but I'm still unable to make it work. Here is my code:

        jQuery(function ($) {
          var OSX = {
            container: null,
            init: function () {
              $("a.osx").click(function (e) {
                e.preventDefault();
                $(this.id + "_osx-modal-content").modal({
                  overlayId: this.id + '_osx-overlay',
                  containerId: this.id + '_osx-container',
                  closeHTML: null,
                  minHeight: 80,
                  opacity: 65,
                  position: ['0',],
                  overlayClose: true,
                  onOpen: OSX.open,
                  onClose: OSX.close
                });
              });
            },
            open: function (d) {
              var self = this;
              self.container = d.container[0];
              d.overlay.fadeIn('slow', function () {
                $("#osx-modal-content", self.container).show();
                var title = $("#osx-modal-title", self.container);
                title.show();
                d.container.slideDown('slow', function () {
                  setTimeout(function () {
                    var h = $("#osx-modal-data", self.container).height() + title.height() + 20; // padding
                    d.container.animate(
                      {height: h},
                      200,
                      function () {
                        $("div.close", self.container).show();
                        $("#osx-modal-data", self.container).show();
                      }
                    );
                  }, 300);
                });
              })
            },
            close: function (d) {
              var self = this; // this = SimpleModal object
              d.container.animate(
                {top: "-" + (d.container.height() + 20)},
                500,
                function () {
                  self.close(); // or $.modal.close();
                }
              );
            }
          };
          OSX.init();
        });

     I guess it's something to do with the open: function part. Can anyone help me?
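     One thing that stands out before open: even runs is the selector in init: $(this.id + "_osx-modal-content") has no "#" prefix, so it is treated as a tag selector and matches nothing; the trailing comma in position: ['0',] can also trip up older IE. A sketch of init with those two details adjusted (assuming the modal markup uses ids such as login_osx-modal-content that match each link's id):

        init: function () {
          $("a.osx").click(function (e) {
            e.preventDefault();
            // "#" makes this an id selector, e.g. "#login_osx-modal-content"
            $("#" + this.id + "_osx-modal-content").modal({
              overlayId: this.id + '_osx-overlay',
              containerId: this.id + '_osx-container',
              closeHTML: null,
              minHeight: 80,
              opacity: 65,
              position: ['0'],
              overlayClose: true,
              onOpen: OSX.open,
              onClose: OSX.close
            });
          });
        },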

    Read the article

  • Fastest image iteration in Python

    - by Greg
     I am creating a simple green screen app with Python 2.7.4 but am getting quite slow results. I am currently using PIL 1.1.7 to load and iterate the images and saw huge speed-ups changing from the old getpixel() to the newer load() and pixel access object indexing. However, the following loop still takes around 2.5 seconds to run for an image of around 720p resolution:

        def colorclose(Cb_p, Cr_p, Cb_key, Cr_key, tola, tolb):
            temp = math.sqrt((Cb_key-Cb_p)**2+(Cr_key-Cr_p)**2)
            if temp < tola:
                return 0.0
            else:
                if temp < tolb:
                    return (temp-tola)/(tolb-tola)
                else:
                    return 1.0

        ....

        for x in range(width):
            for y in range(height):
                Y, cb, cr = fg_cbcr_list[x, y]
                mask = colorclose(cb, cr, cb_key, cr_key, tola, tolb)
                mask = 1 - mask
                bgr, bgg, bgb = bg_list[x, y]
                fgr, fgg, fgb = fg_list[x, y]
                pixels[x, y] = (
                    (int)(fgr - mask*key_color[0] + mask*bgr),
                    (int)(fgg - mask*key_color[1] + mask*bgg),
                    (int)(fgb - mask*key_color[2] + mask*bgb))

     Am I doing anything hugely inefficient here which makes it run so slow? I have seen similar, simpler examples where the loop is replaced by a boolean matrix, for instance, but for this case I can't see a way to replace the loop. The pixels[x,y] assignment seems to take the most amount of time, but not knowing Python very well I am unsure of a more efficient way to do this. Any help would be appreciated.
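     The per-pixel Python loop pays interpreter overhead roughly a million times per frame; the usual fix is to vectorize with NumPy so the whole frame becomes a handful of array operations. A sketch of the same math under that assumption (images converted once to arrays, e.g. numpy.asarray(img.convert('YCbCr')); names mirror the question):

        import numpy as np

        def composite(fg_ycbcr, fg_rgb, bg_rgb, cb_key, cr_key, key_color, tola, tolb):
            # fg_ycbcr, fg_rgb, bg_rgb: H x W x 3 arrays built from the PIL images
            cb = fg_ycbcr[..., 1].astype(np.float32)
            cr = fg_ycbcr[..., 2].astype(np.float32)

            # Distance from the key colour in the Cb/Cr plane, for every pixel at once
            dist = np.sqrt((cb_key - cb) ** 2 + (cr_key - cr) ** 2)

            # Same ramp as colorclose(): 0 below tola, 1 above tolb, linear in between
            mask = 1.0 - np.clip((dist - tola) / (tolb - tola), 0.0, 1.0)
            mask = mask[..., np.newaxis]              # broadcast over the three channels

            out = (fg_rgb.astype(np.float32)
                   - mask * np.asarray(key_color, dtype=np.float32)
                   + mask * bg_rgb.astype(np.float32))
            return np.clip(out, 0, 255).astype(np.uint8)   # Image.fromarray(...) to go back to PIL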

    Read the article

  • Best full text search for mysql?

    - by ConroyP
     We're currently running MySQL on a LAMP stack and have been looking at implementing a more thorough full-text search on our site. We've looked at MySQL's own full-text search, but it doesn't seem to cope well with large databases, which makes it far too slow for our needs. Our main requirements are:
     - speed returning results
     - simple updating of the index
     In addition to the above, our "nice to have"s are:
     - ideally not something that requires adding a module to MySQL
     - plays nicely with PHP (the majority of our dev work is done using PHP)
     There seem to be quite a few healthy open-source projects to add fast, reliable full-text search to MySQL, so I'm basically looking for recommendations/suggestions on what you've found to be the most useful product out there, easiest to set up, etc. So far, the list of ones we've been starting to play around with are:
     - Sphinx: C++ based, used by craigslist, thepiratebay
     - Lucene: Java-based Apache project, powers zeoh.com and zoomf.com
     - Solr: Java-based offshoot of Lucene, used to power searches on Digg, CNet & AOL Channels
     Are there any better ones out there that we haven't come across yet? Can you recommend / suggest against any of the options we've gathered so far? Thanks for your help!

     Update: @Cletus suggested Google's Custom Search Engine. We recently trialled this on a couple of projects, and it's an almost-perfect fit for our needs. The problem is that entries on our site are updated quite regularly, and unfortunately the speed at which entries go in / get updated in Google's index was just too slow and erratic for us to rely on, even with the addition of sitemaps and requested crawl rate changes.

    Read the article

  • Increase Query Speed in PostgreSQL

    - by Anthoni Gardner
     Hello, first time posting here, but an avid reader. I am experiencing slow query times on my database (all tested locally thus far) and am not sure how to go about fixing it. The database itself has 44 tables, and some of those tables have over 1 million records (mainly the movies, actresses and actors tables). The database is built via JMDB using the flat files from IMDB. Also, the SQL query that I am about to show is from that program (which likewise experiences very slow search times). I have tried to include as much information as I can, such as the explain plan etc.

        "QUERY PLAN"
        "HashAggregate  (cost=46492.52..46493.50 rows=98 width=46)"
        "  Output: public.movies.title, public.movies.movieid, public.movies.year"
        "  ->  Append  (cost=39094.17..46491.79 rows=98 width=46)"
        "        ->  HashAggregate  (cost=39094.17..39094.87 rows=70 width=46)"
        "              Output: public.movies.title, public.movies.movieid, public.movies.year"
        "              ->  Seq Scan on movies  (cost=0.00..39093.65 rows=70 width=46)"
        "                    Output: public.movies.title, public.movies.movieid, public.movies.year"
        "                    Filter: (((title)::text ~~* '%Babe%'::text) AND ((title)::text !~~* '""%}'::text))"
        "        ->  Nested Loop  (cost=0.00..7395.94 rows=28 width=46)"
        "              Output: public.movies.title, public.movies.movieid, public.movies.year"
        "              ->  Seq Scan on akatitles  (cost=0.00..7159.24 rows=28 width=4)"
        "                    Output: akatitles.movieid, akatitles.language, akatitles.title, "
        "                    Filter: (((title)::text ~~* '%Babe%'::text) AND ((title)::text !~~* '""%}'::text))"
        "              ->  Index Scan using movies_pkey on movies  (cost=0.00..8.44 rows=1 width=46)"
        "                    Output: public.movies.movieid, public.movies.title, public.movies.year, public.movies.imdbid"
        "                    Index Cond: (public.movies.movieid = akatitles.movieid)"

        SELECT * FROM (
          (SELECT DISTINCT title, movieid, year
             FROM movies
            WHERE title ILIKE '%Babe%' AND NOT (title ILIKE '"%}'))
          UNION
          (SELECT movies.title, movies.movieid, movies.year
             FROM movies
            INNER JOIN akatitles ON movies.movieid = akatitles.movieid
            WHERE akatitles.title ILIKE '%Babe%' AND NOT (akatitles.title ILIKE '"%}'))
        ) AS union_tmp2;

     Returns 612 rows in 9078 ms. The database backup (plain text) is 1.61 GB. It's a really complex query and I am not fully cognizant of it; like I said, it was spat out by JMDB. Do you have any suggestions on how I can increase the speed? Regards, Anthoni
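     Both branches of the plan are sequential scans because a leading-wildcard ILIKE ('%Babe%') cannot use an ordinary b-tree index. One option, on PostgreSQL versions recent enough to support it (trigram acceleration of LIKE/ILIKE via the pg_trgm module, installable with CREATE EXTENSION from 9.1 onward), is a GIN trigram index on the title columns. This is a sketch, not something JMDB ships with:

        -- requires the pg_trgm contrib module
        CREATE EXTENSION pg_trgm;

        CREATE INDEX movies_title_trgm_idx    ON movies    USING gin (title gin_trgm_ops);
        CREATE INDEX akatitles_title_trgm_idx ON akatitles USING gin (title gin_trgm_ops);

        -- the original query stays the same; verify the new plan with
        EXPLAIN ANALYZE
        SELECT DISTINCT title, movieid, year
          FROM movies
         WHERE title ILIKE '%Babe%';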

    Read the article

  • jquery slide toggle divs without creating multiple classes, functions, etc... ui accordion

    - by SPE
     Greetings. Based on the jQuery UI accordion I'm using, I have added a slide toggle to my accordion list items, so what happens is I click on an li and a div slides down underneath to reveal more content. The issue I'm having is that I find myself having to create multiple ids to reference the slide toggle. Example of the div id CSS: #panel, #panel2, #panel3, #panel4, etc.... Is there a way I can use the slide toggle without having to add another number so it will slide? I have 50 list items I'm using. Here's a sample of the JS (as you can see where I'm going with this):

        $(".btn-slide").click(function(){
          $("#panel").slideToggle("slow");
          $(this).toggleClass("active");
          return false;
        });

        $(".btn-slide2").click(function(){
          $("#panel2").slideToggle("slow");
          $(this).toggleClass("active");
          return false;
        });

     Sample HTML:

        <li><div class="slide"><a href="#" class="btn-slide">One</a></div><div id="panel"></div></li>
        <li><div class="slide"><a href="#" class="btn-slide2">Two</a></div><div id="panel2"></div></li>
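     One class plus relative DOM traversal removes the need for numbered ids: give every link the same btn-slide class and every panel a shared panel class, then toggle the panel inside the clicked link's own li. A sketch (assumes a jQuery version with .closest(), i.e. 1.3 or later):

        $(".btn-slide").click(function () {
          // toggle only the panel that lives in this list item
          $(this).closest("li").find(".panel").slideToggle("slow");
          $(this).toggleClass("active");
          return false;
        });

     with markup along the lines of:

        <li><div class="slide"><a href="#" class="btn-slide">One</a></div><div class="panel"></div></li>
        <li><div class="slide"><a href="#" class="btn-slide">Two</a></div><div class="panel"></div></li>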

    Read the article

  • Problem with qTip - Tips not showing because elements load after the script

    - by msvalkon
     Hey, I'm not very experienced with JavaScript, jQuery or its plugins, but usually I manage. Anyway, my client is building a site and one of its purposes is to pick up news articles from different sites and show the titles in unordered HTML lists. I don't have access to his code, and the news articles load up rather slowly (well after the site has loaded). I'm using qTip, and the idea is that once you hover over a news title, it will generate a tooltip. This works fine in my dev environment, because I have dummy titles that are not generated from anywhere. The problem is that once the client sets the site up in his test environment, the scripts that load the news titles into the lists are so slow that the qTip script runs before there are any elements in the lists. Hence it's not aware of any <li>'s to pick up and generate tooltips from. Is there a way to make sure that ALL of the news articles are loaded before the tooltip script runs? I think that a simple delay in loading the script is not very smart, because some of the titles seem to take longer to load than others, so the delay would have to be rather long.
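     Rather than waiting for everything, the tooltip can be attached lazily, so it doesn't matter when the titles arrive. A sketch using jQuery's .live() (appropriate to the jQuery versions of that era) to initialise qTip on first hover; the ul.news selector, the content source, and the qTip 1.x option names are assumptions to adapt:

        $('ul.news li').live('mouseover', function () {
          var $item = $(this);
          if ($item.data('qtipReady')) return;      // already initialised
          $item.data('qtipReady', true);

          $item.qtip({
            content: $item.attr('title') || $item.text(),
            show: { when: { event: 'mouseover' }, ready: true }   // show right away for this first hover
          });
        });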

    Read the article

  • Improve speed of own debug visualizer for Delphi 2010

    - by netcodecz
     I wrote a Delphi debug visualizer for TDataSet to display the values of the current row, source + screenshot: http://delphi.netcode.cz/text/tdataset-debug-visualizer.aspx . It works, but it is very slow. I did some optimization (how to get the field names), but it still takes 10 seconds to show for only 20 fields - very bad. The main problem seems to be the slow IOTAThread90.Evaluate used by the main code shown below; this procedure costs most of the time, and the line marked ** takes about 80% of it. FExpression is the name of the TDataSet in code.

        procedure TDataSetViewerFrame.mFillData;
        var
          iCount: Integer;
          I: Integer;
        //  sw: TStopwatch;
          s: string;
        begin
        //  sw := TStopwatch.StartNew;
          iCount := StrToIntDef(Evaluate(FExpression+'.Fields.Count'), 0);
          for I := 0 to iCount - 1 do
          begin
            s := s + Format('%s.Fields[%d].FieldName+'',''+', [FExpression, I]);
        //    FFields.Add(Evaluate(Format('%s.Fields[%d].FieldName', [FExpression, I])));
            FValues.Add(Evaluate(Format('%s.Fields[%d].Value', [FExpression, I]))); //**
          end;
          if s <> '' then
            Delete(s, Length(s)-4, 5);
          s := Evaluate(s);
          s := Copy(s, 2, Length(s) - 2);
          FFields.CommaText := s;
        {  sw.Stop;
          s := sw.Elapsed;
          Application.MessageBox(PChar(s), ''); }
        end;

     Now I have no idea how to improve performance.

    Read the article

  • Debugging SQL Server Slowness: Same Database, Different Servers

    - by Craig Walker
     For a while now we've been having anecdotal slowness on our newly-minted (VMware-based) SQL Server 2005 database servers. Recently the problem has come to a head and I've started looking for the root cause of the issue. Here's the weird part: on the stored procedure that I'm using as a performance test case, I get a 30x difference in execution speed depending on which DB server I run it on. This is using the same database (mdf) and log (ldf) files, detached, copied, and reattached from the slow server to the fast one. This doesn't appear to be a (virtualized) hardware issue: the slow server has 4x the CPU capacity and 2x the memory of the fast one. As best as I can tell, the problem lies in the environment/configuration of the servers (either the operating system or the SQL Server installation). However, I've checked a bunch of variables (SQL Server config options, running services, disk fragmentation) and found nothing that has made a difference in testing. What things should I be looking at? What tools can I use to investigate why this is happening?
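     One starting point, offered as a sketch rather than a diagnosis: run the same call on both servers with timing, I/O, and the actual plan captured, then diff the plans and the instance-level settings (the procedure name below is a placeholder):

        SET STATISTICS IO ON;
        SET STATISTICS TIME ON;
        SET STATISTICS XML ON;      -- returns the actual execution plan as XML
        EXEC dbo.MyTestProcedure;
        SET STATISTICS XML OFF;

        -- Compare instance configuration between the fast and slow servers
        SELECT name, value_in_use
          FROM sys.configurations
         ORDER BY name;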

    Read the article

  • what is this internet explorer (javascript?) syntax error -2146827286?

    - by ben
     Hi everyone, so I've been trying to troubleshoot this bug where some big percentage of Windows users who are on my ajax (jQuery) web app are not able to play. I haven't been able to reproduce it on my end with Windows 7 / IE8 running in a Parallels VM. The main problem seems to be in the JavaScript somewhere, because what users are complaining about is that an ajax button isn't working. They click it and nothing happens, so either the event isn't firing, or my ajax call is failing, or possibly the return from the ajax call is failing. After trying some ideas, a friend suggested I check out damnit! https://damnit.jupiterit.com/ which will catch exceptions in JavaScript and email them to you. This is a pretty awesome tool! So now I have a little more data, but am stuck. Basically it seems like the majority of the exceptions are complaining about a syntax error. I will paste the samples below.

        message: Syntax error
        number: -2146827286
        description: Syntax error
        Browser: Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; Trident/4.0; .NET CLR 1.1.4322; .NET CLR 2.0.50727; .NET CLR 3.0.4506.2152; .NET CLR 3.5.30729; InfoPath.2; .NET CLR 1.0.3705; OfficeLiveConnector.1.3; OfficeLivePatch.0.0)

     What's interesting is that the syntax error is consistently occurring in browsers reporting MSIE 8.0 but with Windows Vista, XP, and below, so an older OS with the latest IE. Does anyone know of this error? Could this possibly be some weird slow computer / slow internet connection thing where maybe my JavaScript files aren't fully loaded before I call the functions? I am using jQuery's $(document).ready() to wait before I set anything up.

    Read the article

  • How to scale a sprite image without losing color key information?

    - by Michael P
     Hello everyone, I'm currently developing a simple application that displays a map and draws some markers on it. I'm developing for Windows Mobile, so I decided to use the DirectDraw and Imaging interfaces to make the application fast and pretty. The map moves when the user moves a finger on the touchscreen, so the whole map moving/scrolling animation has to be fast, but it is not. On every map update I have to draw a portion of the map, the control buttons, and the markers - buttons and markers are preloaded on a DirectDraw surface as a mipmap. So the only thing I do is BitBlt from the mipmap to a back buffer, and from the back buffer to the primary surface (I can't use page flipping due to the windowed mode of my application). Previously I used a premultiplied-alpha surface with a 32-bit ARGB pixel format for the image mipmap; everything was looking good, but drawing the entire "scene" was horribly slow - I could forget about smooth map scrolling. Now I'm using a mipmap with the native (RGB565) pixel format and a fuchsia (0xFF00FF) color key, and drawing is much better. My mipmap surface is generated at program load - images are loaded from files, scaled (with filtering) and drawn onto the mipmap. The problem is that the image scaling process blends pixel colors, and those pixels which are on the border of a sprite region are blended with the surrounding fuchsia pixels, resulting in semi-fuchsia colors that are not treated as the color key. When I do the blit with the color key option, sprites have small fuchsia-like borders, and it looks really bad. How can I solve this problem? I can use alpha blitting, but it is too slow - even in ARGB 1555 format.
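     Two common workarounds, offered as sketches rather than DirectDraw specifics: scale the sprites with nearest-neighbour sampling (no blending, so the key stays exact), or keep the filtered scale but snap any pixel that is "almost fuchsia" back to the exact key colour afterwards, so the keyed blit treats it as transparent again. A rough C++ pass for the second idea (the threshold is a guess to tune; 0xF81F is fuchsia in RGB565):

        #include <cstdint>

        static void SnapToColorKey(uint16_t* pixels, int count)
        {
            const int kThreshold = 16;               // per-channel closeness, in 8-bit units
            for (int i = 0; i < count; ++i)
            {
                uint16_t p = pixels[i];
                int r = ((p >> 11) & 0x1F) << 3;     // expand RGB565 to 8-bit channels
                int g = ((p >> 5)  & 0x3F) << 2;
                int b = ( p        & 0x1F) << 3;
                // close to magenta: high red, low green, high blue
                if (r >= 255 - kThreshold && g <= kThreshold && b >= 255 - kThreshold)
                    pixels[i] = 0xF81F;              // exact colour key
            }
        }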

    Read the article

  • Running Magento for multiple clients - single Installaton vs. multiple installations

    - by Chris Hopkins
     Hi there, I am looking to set up a Magento (Community Edition) installation for multiple clients and have researched the matter for a few days now. I can see that the Enterprise Edition has what I need in it, but surprisingly I am not willing to shell out the $12,000-odd yearly subscription. It seems there are a few options available to me, but I am worried about the performance I will get out of the various options.

     Option 1) Single install using the AITOC advanced permissions module. This is really what I am after: one installation, so that I can update my core files all at the same time and also manage all my store users from one place. The problems here are that I don't know anything about the reliability of this extra product and that I have to pay a bit extra. I am also worried that if I have 10 stores running off this one installation it might all slow down so much and keel over, as I have heard a lot about Magento's slowness. Module link: http://www.aitoc.com/en/magentomods_advanced_permissions.html

     Option 2) Multiple installations of Magento on one server, one for each shop. So here I have 10 Magento installations on one server, all running happily away, not costing any extra money, but I now have 10 separate stores to update and maintain, which could be annoying. Also, I haven't been able to find a whole lot of other people using this method, and when I have, they are usually asking how to stop their servers from dying. So this route seems like it could be even worse on my server, as I will have more going on, but if my server could take it, each Magento installation would be simpler and less likely to slow down, since none of them has to run 10 shops on its own?

     Option 3) Use lots of servers and lots of Magento installations. I just so do not want to do this.

     Option 4) Buy Magento Enterprise. I do not have the money to do this.

     So which route is less likely to blow up my server? And does anyone have experience with this holy grail of a module? Thanks for reading and thanks in advance for any help - Chris Hopkins

    Read the article

  • Are SharePoint site templates really less performant than site definitions?

    - by Jim
     So, it seems in the SharePoint blogosphere that everybody just copies and pastes the same bullet points from other blogs. One bullet point I've seen is that SharePoint site templates are less performant than site definitions because site definitions are stored on the file system. Is that true? It seems odd that site templates would be less performant. It's my understanding that all site content lives in a database, whether you use a site template or a site definition. A site template is applied once to the database, and from then on the site should not care whether the content was created using a site template or not. So, does anybody have an architectural reason why a site template would be less performant than a site definition?

     Edit: Links to the blogs that say there is a performance difference:
     - From MSDN: Because it is slow to store templates in and retrieve them from the database, site templates can result in slower performance.
     - From DevX: However, user templates in SharePoint can lead to performance problems and may not be the best approach if you're trying to create a set of reusable templates for an entire organization.
     - From IT Footprint: Because it is slow to store templates in and retrieve them from the database, site templates can result in slower performance. Templates in the database are compiled and executed every time a page is rendered.
     - From Branding SharePoint: Custom site definitions hold the following advantages over custom templates: Data is stored directly on the Web servers, so performance is typically better.

     At a minimum, I think the above articles are incomplete, and I think several are misleading based on what I know of SharePoint's architecture. I read another blog post that argued against the performance differences, but I can't find the link.

    Read the article

  • sybase - fails to use index unless string is hard-coded

    - by Garrett
     I'm using Sybase 12.5.3 (ASE); I'm new to Sybase, though I've worked with MSSQL pretty extensively. I'm running into a scenario where a stored procedure is really very slow. I've traced the issue to a single SELECT statement for a relatively large table. Modifying that statement dramatically improves the performance of the procedure (and reverting it drastically slows it down; i.e., the SELECT statement is definitely the culprit).

        -- Sybase optimizes and uses multi-column index... fast!
        SELECT ID, status, dateTime
          FROM myTable
         WHERE status IN ('NEW', 'SENT')
         ORDER BY ID

        -- Sybase does not use the index and does a very slow table scan
        SELECT ID, status, dateTime
          FROM myTable
         WHERE status IN (SELECT status FROM allowableStatusValues)
         ORDER BY ID

     The code above is an adapted/simplified version of the actual code. Note that I've already tried recompiling the procedure, updating statistics, etc. I have no idea why Sybase ASE would choose an index only when strings are hard-coded and choose a table scan when choosing from another table. Someone please give me a clue, and thank you in advance.
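     With hard-coded values the optimizer can probe the index once per literal, but with IN (SELECT ...) it has to pick a plan without knowing the values. Two things often worth trying, sketched against the simplified table names (the index name is a placeholder): rewrite the subquery as a join so the small status table drives index lookups on myTable, or keep the subquery and force the index with an ASE table hint.

        -- Join form: let allowableStatusValues drive lookups into myTable's index
        SELECT t.ID, t.status, t.dateTime
          FROM allowableStatusValues s
          JOIN myTable t
            ON t.status = s.status
         ORDER BY t.ID

        -- Or keep the subquery and force the index with a hint
        SELECT ID, status, dateTime
          FROM myTable (INDEX idx_myTable_status)
         WHERE status IN (SELECT status FROM allowableStatusValues)
         ORDER BY ID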

    Read the article

  • Ajax -- re-load div on success.

    - by RPM
     I am trying to accomplish a "re-load." More specifically, I need to be able to refresh a portion of my page as a result of another successful ajax call. Moreover, I load this portion of the page via ajax, and it obtains its content from an ajax post. The result is my content being displayed inside my portion precisely. I need this portion of the page refreshed after a successful ajax post. Here is some of the code:

        /* Ajax -- this part is loaded automatically, and I need it reloaded upon success of
           another ajax post. This data comes from the outcome of my other ajax function. */
        $('#newCo').load('click', function() {
          $.ajax({
            url: 'index.php?dkd432k=uBus/310/Indeed',
            dataType: 'json',
            success: function(json) {
              if (json['newCompare']) {
                $('#newCo .newResults').html(json['newCompare']);
              }
            }
          });
        });

     The next portion of code is responsible for posting the data that I obtain in the ajax function above:

        function ZgHiapud (ofWhich) {
          $.ajax({
            url: 'index.php?dkd432k=uBus/310/update',
            type: 'post',
            data: 'product_id=' + product_id,
            dataType: 'json',
            success: function(json) {
              $('.success, .warning, .attention, .information').remove();
              if (json['success']) {
                $('.attention').fadeIn('slow');
                $('#compare_total').html(json['total']);
                $('html, body').animate({ scrollTop: 0 }, 'slow');
              }
            }
          });
        }

     In the end, I need to obtain the data that I send to the server immediately upon success of the second ajax call. This data that is sent via the second ajax call needs to fire the first ajax call upon success.
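     One way to wire this up, sketched with the same URLs and selectors as above: pull the first request out into a named function, call it when the page portion should first load, and call it again from the second request's success handler so the portion refreshes as soon as the post succeeds (refreshCompare is a made-up name):

        // Extracted so it can run on page load and again after a successful post
        function refreshCompare() {
          $.ajax({
            url: 'index.php?dkd432k=uBus/310/Indeed',
            dataType: 'json',
            success: function (json) {
              if (json['newCompare']) {
                $('#newCo .newResults').html(json['newCompare']);
              }
            }
          });
        }

        function ZgHiapud(ofWhich) {
          $.ajax({
            url: 'index.php?dkd432k=uBus/310/update',
            type: 'post',
            data: 'product_id=' + product_id,
            dataType: 'json',
            success: function (json) {
              $('.success, .warning, .attention, .information').remove();
              if (json['success']) {
                $('.attention').fadeIn('slow');
                $('#compare_total').html(json['total']);
                $('html, body').animate({ scrollTop: 0 }, 'slow');
                refreshCompare();   // re-load the portion with the fresh data
              }
            }
          });
        }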

    Read the article

  • Help, my CentOS servers keep going down , No route to host after a random uptime [closed]

    - by user249071
     Hello, I have a couple of CentOS Linux servers that have a very simple task: they run nginx + FastCGI for PHP, and some read-only NFS mounts between them. They have some RPC commands to start some downloading processes with wget, nothing fancy, from a main server, but their behavior is very unstable: they simply go down. We tried to monitor RAM, processor usage, even network connections; they don't load up very much - 250 network connections max, 15% processor usage, and memory doesn't even fill up, 2.5GB out of 8GB max. I have no idea why a Linux server can go down like that; they aren't even public servers, no domain names installed, no public serving of sites. The only thing that I've discovered was that if I didn't restart the network service every couple of hours or so... the servers were becoming very slow, starting apps very slowly, but not reporting a high usage of resources... Maybe CentOS doesn't free the timed-out connections, or something like that... It's based on Red Hat, right? I'm not a Linux expert, but I'm sure that there are a few guys out there that can easily have an answer to this, or even have some leads as to what I can do... I haven't installed snort, or other things to check whether we have some DoS attacks; still, the scheduled script that restarts the network each hour should put the system back online, and it doesn't.... Thank you in advance

    Read the article

  • How to get DIVs into this code via JQuery

    - by ludz
     Heya everyone, I've been struggling along with this piece of code for the longest time; it's driving me insane. I am trying many different things and looking at past posts here, but nothing seems to be helping. Basically I have a jQuery pagination code in place and I want to add animated transitions between pages. With some assistance I got that working correctly; however, this causes the page to jump around as the new items fade in and out. To fix this I need a DIV wrapped around each selection of results. I have been trying to use .wrap, .html, .wrapInner, .append and I can't get any of it to work properly. The two areas where I believe the code needs to be placed are as follows:

        $('#content').children().slice(0, show_per_page).css('display', 'block');

     and

        $('#content').children().fadeOut('slow').slice(start_from, end_on).fadeIn('slow');

     Full original code: http://tutsvalley.com/tutorials/making-a-jquery-pagination-system/ Only the second line of code posted here has been altered. Basically I want to wrap each group of sliced output in a DIV. I hope that makes sense; if you need any more information please let me know. Any ideas or suggestions on what to try would be much appreciated, as it's currently driving me crazy :) Ludz~
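     jQuery's .wrapAll() puts one wrapper element around a whole selection, so each page-sized slice can be grouped once up front and the transition can then fade whole wrappers instead of loose children. A sketch of the idea (the result-page class name is made up; show_per_page and current_page come from the pagination tutorial):

        // One-time setup: wrap each page-sized group of results in its own div
        var $items = $('#content').children();
        for (var i = 0; i < $items.length; i += show_per_page) {
          $items.slice(i, i + show_per_page).wrapAll('<div class="result-page"></div>');
        }

        // When changing pages, fade the wrapper divs rather than the raw items
        $('#content > div.result-page').hide().eq(current_page).fadeIn('slow');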

    Read the article

  • Jquery animate bottom problem

    - by andrea
     It's strange. I have an animation that starts when I .hover over a DIV. Inside this div I have another, absolutely positioned, that I want to move up while the mouse stays on the main div and go back down to its position when I leave the main div. A simple hover effect. I have written this function, which works well apart from a strange bug:

        var inside = $('.inside');

        inside.hover(function() {
          $(this).children(".coll_base").stop().animate({
            bottom: '+=30px'
          }, "slow")
        }, function() {
          $(this).children(".coll_base").stop().animate({
            bottom: '-=30px'
          }, "slow");
        });

     I need += 30 and -= 30 because I have to apply this function to a range of divs with different bottom values. The problem is that if I hover in and out of the div, the DIV animates right, but each time it goes down it decreases the bottom value a little more. If I move in and out of the main div repeatedly, I see that the animated div goes down even beyond the main div. I think I'm missing something needed to resolve this bug. Thanks for any help.
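     The drift comes from the relative targets: stop() can halt the upward animation part-way, and the following '-=30px' then subtracts a full 30px from wherever the element happens to be. Remembering each element's resting bottom once and animating back to that absolute value keeps the different per-div offsets but cannot accumulate error. A sketch of that approach:

        $('.inside').each(function () {
          var base = $(this).children('.coll_base');
          // remember this div's own resting position (each div has a different bottom)
          base.data('restBottom', parseInt(base.css('bottom'), 10) || 0);
        });

        $('.inside').hover(function () {
          var base = $(this).children('.coll_base');
          base.stop().animate({ bottom: base.data('restBottom') + 30 }, 'slow');
        }, function () {
          var base = $(this).children('.coll_base');
          base.stop().animate({ bottom: base.data('restBottom') }, 'slow');
        });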

    Read the article
