Search Results

Search found 17045 results on 682 pages for 'high cpu usage'.

Page 128 of 682

  • Beginner PHP developer: is using LiveDocx with Zend Framework a CPU resource eater?

    - by user63898
    Hello all. I'm a beginner in the PHP world, and I need to build a feature in a web application that converts well-defined structures from txt/HTML into RTF/PDF. Searching this site, I found the LiveDocx PHP component, which depends on Zend Framework. I'm not familiar with the PHP engine (the parser), so I'm asking you experts: is it a good solution to use these components, or is it just overhead?
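
    For what it's worth, LiveDocx is a SOAP web service, so the merge and PDF conversion happen on the remote server rather than in your PHP process; local CPU cost is mostly the SOAP round trip. A hedged sketch of the usual round trip with Zend Framework 1.x (credentials and file names are placeholders):

        require_once 'Zend/Service/LiveDocx/MailMerge.php';

        $mailMerge = new Zend_Service_LiveDocx_MailMerge();
        $mailMerge->setUsername('myUsername'); // placeholder credentials
        $mailMerge->setPassword('myPassword');

        $mailMerge->setLocalTemplate('template.docx');      // the well-defined structure
        $mailMerge->assign('customerName', 'Example Corp'); // merge field => value
        $mailMerge->createDocument();

        // Conversion happens server-side; we just store the returned bytes.
        file_put_contents('output.pdf', $mailMerge->retrieveDocument('pdf'));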

    Read the article

  • Peer review: maatkit's mk-parallel-dump and mk-parallel-restore usage?

    - by Brent
    Hi, I'm trying to make use of maatkit as a means of dumping a database and then restoring it to another database.

    For dumps:

        mk-parallel-dump --user abc --password xyz --databases $db --base-dir /tmp/dump

    For restore:

        mk-parallel-restore --create-databases --user abc --password xyz --database devdb /tmp/dump

    My question is: is my logic and understanding correct, and would it be OK to do it like this? Kind regards, Brent

    Read the article

  • C# immutable object usage best practices? Should I be using them as much as possible?

    - by Daniel
    Say I have a simple object such as:

        class Something
        {
            public int SomeInt { get; set; }
        }

    I have read that using immutable objects is faster and a better way of handling business objects. If this is so, should I strive to make all my objects like this?

        class ImmutableSomething
        {
            public int SomeInt { get { return m_someInt; } }

            private int m_someInt = 0;

            public void ChangeSomeInt(int newValue)
            {
                m_someInt = newValue;
            }
        }

    What do you reckon?
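
    A hedged aside: the ImmutableSomething above is not actually immutable, since ChangeSomeInt rewrites the field in place. A minimal sketch of a genuinely immutable version (names carried over from the question):

        class ImmutableSomething
        {
            private readonly int m_someInt; // readonly: can only be assigned in a constructor

            public int SomeInt { get { return m_someInt; } }

            public ImmutableSomething(int someInt)
            {
                m_someInt = someInt;
            }

            // "Changing" a value returns a new instance instead of mutating this one.
            public ImmutableSomething WithSomeInt(int newValue)
            {
                return new ImmutableSomething(newValue);
            }
        }

    Usage: var bigger = small.WithSomeInt(42) leaves small untouched, which is what makes instances safe to share across threads.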

    Read the article

  • Can I spread out a long-running stored proc across multiple CPUs?

    - by Russ
    [Also on SuperUser: http://superuser.com/questions/116600/can-i-spead-out-a-long-running-stored-proc-accross-multiple-cpus]
    I have a stored procedure in SQL Server that gets and decrypts a block of data (credit cards, in this case). Most of the time the performance is tolerable, but for a couple of customers the process is painfully slow, taking literally a minute to complete. (59,377 ms to return from SQL Server, to be exact, though it can vary by a few hundred ms based on load.) When I watch the process, I see that SQL Server uses only a single processor for the whole operation, typically processor 0. Is there a way I can change my stored proc so that SQL Server can multi-thread the process? Is it even feasible to cheat, break the calls in half (top 50%, bottom 50%), and spread the load as a gross hack? (Just spit-balling here.) My stored proc:

        USE [Commerce]
        GO
        /****** Object: StoredProcedure [dbo].[GetAllCreditCardsByCustomerId] Script Date: 03/05/2010 11:50:14 ******/
        SET ANSI_NULLS ON
        GO
        SET QUOTED_IDENTIFIER ON
        GO
        ALTER PROCEDURE [dbo].[GetAllCreditCardsByCustomerId]
            @companyId UNIQUEIDENTIFIER,
            @DecryptionKey NVARCHAR(MAX)
        AS
        SET NOCOUNT ON

        DECLARE @cardId uniqueidentifier
        DECLARE @tmpdecryptedCardData varchar(max);
        DECLARE @decryptedCardData varchar(max);
        DECLARE @tmpTable AS TABLE (
            CardId uniqueidentifier,
            DecryptedCard nvarchar(max)
        )

        DECLARE creditCards CURSOR FAST_FORWARD READ_ONLY FOR
            SELECT cardId FROM CreditCards
            WHERE companyId = @companyId AND Active = 1
            ORDER BY addedBy DESC

        OPEN creditCards
        FETCH creditCards INTO @cardId -- prime the cursor

        WHILE @@FETCH_STATUS = 0
        BEGIN
            DECLARE creditCardData CURSOR FAST_FORWARD READ_ONLY FOR
                SELECT CONVERT(nvarchar(max), DecryptByCert(Cert_Id('Oh-Nay-Nay'), EncryptedCard, @DecryptionKey))
                FROM CreditCardData
                WHERE cardid = @cardId
                ORDER BY valueOrder

            OPEN creditCardData
            FETCH creditCardData INTO @tmpdecryptedCardData -- prime the cursor

            WHILE @@FETCH_STATUS = 0
            BEGIN
                PRINT 'CreditCardData'
                PRINT @tmpdecryptedCardData
                SET @decryptedCardData = ISNULL(@decryptedCardData, '') + @tmpdecryptedCardData
                PRINT '@decryptedCardData'
                PRINT @decryptedCardData
                FETCH NEXT FROM creditCardData INTO @tmpdecryptedCardData -- fetch next
            END

            CLOSE creditCardData
            DEALLOCATE creditCardData

            INSERT INTO @tmpTable (CardId, DecryptedCard) VALUES (@cardId, @decryptedCardData)
            SET @decryptedCardData = ''

            FETCH NEXT FROM creditCards INTO @cardId -- fetch next
        END

        SELECT CardId, DecryptedCard FROM @tmpTable

        CLOSE creditCards
        DEALLOCATE creditCards
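
    For what it's worth, a hedged sketch of a set-based alternative: SQL Server will not parallelize across iterations of a cursor loop, but a single set-based statement is at least eligible for a parallel plan. This assumes the schema above and uses the FOR XML PATH trick to concatenate the decrypted fragments in valueOrder (watch for XML-entity escaping if the decrypted data can contain special characters):

        -- Hedged sketch: one statement instead of nested cursors (schema as above).
        SELECT cc.cardId,
               (SELECT CONVERT(NVARCHAR(MAX),
                        DecryptByCert(Cert_Id('Oh-Nay-Nay'), ccd.EncryptedCard, @DecryptionKey))
                FROM CreditCardData ccd
                WHERE ccd.cardid = cc.cardId
                ORDER BY ccd.valueOrder
                FOR XML PATH(''), TYPE).value('.', 'NVARCHAR(MAX)') AS DecryptedCard
        FROM CreditCards cc
        WHERE cc.companyId = @companyId
          AND cc.Active = 1
        ORDER BY cc.addedBy DESC;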

    Read the article

  • iPhone: enough CPU power for DSP/Fourier transforms in the frequency domain?

    - by mahboudz
    I want to analyze microphone audio on an ongoing basis (not just a snippet or prerecorded sample), display a frequency graph, and filter out certain aspects of the audio. Is the iPhone powerful enough for that? I suspect the answer is yes, given Google's and the iPhone's voice recognition, Shazam and other music-recognition apps, and the guitar-tuner apps out there. However, I don't know what limitations I'll have to deal with. Has anyone played around in this area?
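
    The usual route is the Accelerate framework's hardware-tuned vDSP routines, assuming they are available on the target OS version. A hedged sketch in C of the standard flow; the 1024-sample mono float buffer is a placeholder for audio already captured (e.g. via an Audio Unit):

        #include <Accelerate/Accelerate.h> // vDSP: Apple's accelerated DSP library

        #define N     1024 // FFT size (power of two)
        #define LOG2N 10

        /* Compute the squared-magnitude spectrum of a real-valued buffer.
           `samples` holds N floats; `magnitudes` receives N/2 bins. */
        void magnitude_spectrum(const float *samples, float *magnitudes) {
            static FFTSetup setup = NULL;
            if (!setup)
                setup = vDSP_create_fftsetup(LOG2N, kFFTRadix2); // reuse across calls

            float real[N / 2], imag[N / 2];
            DSPSplitComplex split = { real, imag };

            /* Pack the real signal into the split-complex layout vDSP expects. */
            vDSP_ctoz((const DSPComplex *)samples, 2, &split, 1, N / 2);

            /* In-place real-to-complex FFT. */
            vDSP_fft_zrip(setup, &split, 1, LOG2N, FFT_FORWARD);

            /* Squared magnitudes for the N/2 frequency bins. */
            vDSP_zvmags(&split, 1, magnitudes, 1, N / 2);
        }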

    Read the article

  • What is the correct usage of blueprint-typography-body([$font-size])?

    - by Alexis Abril
    Recent convert to RoR here; I've been using Compass with Blueprint to dip into the proverbial pool. Compass has been fantastic, but I've come across something strange in the typography library. The blueprint-typography-body mixin contains the following:

        =blueprint-typography-body($font-size: $blueprint-font-size)
          line-height: 1.5
          +normal-text
          font-size: 100% * $font-size / 16px

    My question revolves around $font-size. I'm a bit lost, as I would expect to pass in a font size and have that size reflected on page load. However, in this scenario the formula seems to emit a percentage of the default font. For example, +blueprint-typography-body(10px) produces 7.5px against the default font size of 12px, from what I can tell. In essence, I'm curious whether there is a standard way of setting the font size within Compass other than explicitly declaring font-size: 10px. Note: the reason I'm leaning towards the Blueprint/Compass font stylings is the standardization of line-heights, fonts and colors.
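
    If it helps, here is the arithmetic worked through (a sketch of my own, not from the Compass docs; it assumes the 16px browser default that the hard-coded 16px divisor encodes):

        // +blueprint-typography-body(10px)
        //   => font-size: 100% * 10px / 16px = 62.5%
        //   => against a 16px browser default: 16px * 62.5% = 10px  (intended)
        //   => against a 12px base:            12px * 62.5% = 7.5px (what you observed)
        // So the argument is the size you want assuming a 16px root,
        // not an absolute pixel value.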

    Read the article

  • Need to optimize this PHP script for "recent posts". Fatal error when post count is high...

    - by Scott B
    The code below produces an error on a site with roughly 1,500 posts. It performs fine when the post count is modest; this heavier load is exposing the weakness of the code, and I'd like to optimize it. Interestingly, when I disable this menu and use the "Recent Posts" widget instead, the posts render fine, so I could probably borrow from that code if I knew where to find it; better yet, I could call the widget directly in my theme, passing it a post-count variable.

        Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 16384 bytes)
        in /home1/est/public_html/mysite/wp-includes/post.php on line 3462

    The code is below. Its purpose is to list "recent posts".

        global $post;
        $cat = get_cat_ID('myMenu');
        $cathidePost = get_cat_ID('hidePost');
        $myrecentposts = get_posts(array(
            'post__not_in' => get_option('sticky_posts'),
            'cat'          => "-$cat,-$cathidePost",
            'showposts'    => $count_of_posts,
        ));
        // Fetches every matching post just to count them: the likely memory hog.
        $myrecentposts2 = get_posts(array(
            'post__not_in' => get_option('sticky_posts'),
            'cat'          => "-$cat,-$cathidePost",
            'showposts'    => -1,
        ));
        $myrecentpostscount = count($myrecentposts2);
        if ($myrecentpostscount > 0) { ?>
            <div class="recentPosts">
                <h4><?php if ($myHeading !== "") { echo $myHeading; } else { echo "Recent Posts"; } ?></h4>
                <ul>
                <?php
                $current_page_recent = get_post($current_page);
                foreach ($myrecentposts as $idxrecent => $post) {
                    if ($post->ID == $current_page_recent->ID) {
                        $home_menu_recent = ' class="current_page_item';
                    } else {
                        $home_menu_recent = ' class="page_item';
                    }
                    $myclassrecent = ($idxrecent == count($myrecentposts) - 1)
                        ? $home_menu_recent . ' last"'
                        : $home_menu_recent . '"';
                    ?>
                    <li<?php echo $myclassrecent ?>><a href="<?php the_permalink(); ?>"><?php the_title(); ?></a></li>
                <?php }
                if (($myrecentpostscount > $count_of_posts) && $count_of_posts > -1) { ?>
                    <li><a href="<?php bloginfo('url'); ?>/recent">View All Posts</a></li>
                <?php } ?>
                </ul>
            </div>
        <?php } ?>
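
    A hedged sketch of one way to drop the second query entirely: ask WordPress for the total match count via WP_Query's found_posts instead of materializing every post (assumes $count_of_posts, $cat and $cathidePost as above):

        // Hedged sketch: count matches without loading every post.
        $recent = new WP_Query(array(
            'post__not_in'   => get_option('sticky_posts'),
            'cat'            => "-$cat,-$cathidePost",
            'posts_per_page' => $count_of_posts, // fetch only what we display
        ));
        $myrecentposts      = $recent->posts;
        $myrecentpostscount = $recent->found_posts; // total matches, no extra rows loaded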

    Read the article

  • How do I simulate the usage of a sequence of web pages?

    - by Rory Becker
    I have a simple sequence of web pages written in ASP.NET 3.5 SP1:

    Page1 - a logon form: txtUsername, txtPassword and cmdLogon
    Page2 - a menu (created using DevExpress ASP.NET controls)
    Page3 - the page the server redirects to when the user picks the right menu option on Page2

    I would like to create a threaded program to simulate many users trying to use this sequence of pages. I have managed to create a host WinForms app which launches a new thread for each "user", and I have worked out enough of the basics of WebRequest to perform a request which retrieves the logon page itself:

        Dim Request As HttpWebRequest = TryCast(WebRequest.Create("http://MyURL/Logon.aspx"), HttpWebRequest)
        Dim Response As HttpWebResponse = TryCast(Request.GetResponse(), HttpWebResponse)
        Dim ResponseStream As StreamReader = New StreamReader(Response.GetResponseStream(), Encoding.GetEncoding(1252))
        Dim HTMLResponse As String = ResponseStream.ReadToEnd()
        Response.Close()
        ResponseStream.Close()

    I need to simulate the user having entered information into the two TextBoxes and pressing logon. I have a hunch this requires me to add the right sort of "PostData" to the request before submitting. However, I'm also concerned that ViewState may be an issue. Am I correct regarding the PostData? How do I add the PostData to the request? Do I need to be concerned about ViewState? Update: while I appreciate that Selenium or similar products are useful for acceptance testing, I find them rather clumsy for what amounts to load testing. I would prefer not to load 100 instances of Firefox or IE in order to simulate 100 users hitting my site, which is why I was hoping to take the ASP.NET HttpWebRequest route.
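
    A hedged sketch of the POST leg: scrape __VIEWSTATE and __EVENTVALIDATION from the GET response above and echo them back along with the form fields. Field names assume the client IDs render as txtUsername/txtPassword/cmdLogon and the default WebForms hidden-input markup; credentials are placeholders:

        Imports System.Net
        Imports System.IO
        Imports System.Text.RegularExpressions

        Module LogonSimulator

            ' Pull the value of a WebForms hidden input such as __VIEWSTATE
            ' (assumes the default rendering: ... id="__VIEWSTATE" value="...").
            Private Function ExtractHidden(ByVal html As String, ByVal name As String) As String
                Dim m As Match = Regex.Match(html, "id=""" & name & """ value=""([^""]*)""")
                If m.Success Then Return m.Groups(1).Value Else Return ""
            End Function

            Public Sub PostLogon(ByVal logonPageHtml As String, ByVal cookies As CookieContainer)
                Dim postData As String = _
                    "__VIEWSTATE=" & Uri.EscapeDataString(ExtractHidden(logonPageHtml, "__VIEWSTATE")) & _
                    "&__EVENTVALIDATION=" & Uri.EscapeDataString(ExtractHidden(logonPageHtml, "__EVENTVALIDATION")) & _
                    "&txtUsername=testuser&txtPassword=secret&cmdLogon=Logon"

                Dim request As HttpWebRequest = CType(WebRequest.Create("http://MyURL/Logon.aspx"), HttpWebRequest)
                request.Method = "POST"
                request.ContentType = "application/x-www-form-urlencoded"
                request.CookieContainer = cookies ' carries the session on to Page2 and Page3

                Using writer As New StreamWriter(request.GetRequestStream())
                    writer.Write(postData)
                End Using

                Using response As HttpWebResponse = CType(request.GetResponse(), HttpWebResponse)
                    ' With AllowAutoRedirect left on, a successful logon lands on the menu page.
                End Using
            End Sub
        End Module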

    Read the article

  • Perl+Image::Magick usage: how to assemble several areas of one image into a new image?

    - by Jin
    Hi everybody, I'm new to ImageMagick and haven't figured out how to assemble several areas into a new image. For example, I know the geometry of the words "hello" and "world" in an image, and I need to retrieve the word images and put them into one line image while keeping their relative positions. Question 1: supposing I use the Perl API, how should I use Composite() or another appropriate method to do this?

        my $geom = sprintf('%dx%x+%d+%d', $word->{width}, $word->{height}, $offsetx, $offsety);
        $x = $lineimg->Composite($wordimg, $geom);
        warn "$x" if "$x";

    Assume $lineimg is big enough to hold all the word images and the geometry has been computed. This code draws a complaint from ImageMagick:

        Exception 410: composite image required `Image::Magick' @ Magick.xs/XS_Image__Magick_Mogrify/7790 ...

    Question 2: currently I only know how to crop a word image out of the original and then use the Clone() method to restore the original image. Is there a way to copy, rather than crop, a specific area of an image? That would save copying the whole image back and forth several times. Does anybody know how to write this kind of processing? I appreciate all your help and suggestions! -Jin
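
    For what it's worth, a hedged sketch of the fix: PerlMagick's Composite() takes named parameters, and "composite image required" is the exception it raises when the image => argument is missing. Separately, the '%x' in the sprintf format prints the height in hexadecimal where '%d' was presumably intended. For question 2, cropping a clone copies a region without disturbing the original (the region variables below are hypothetical):

        # Named parameters, and %d (not %x) for the height.
        my $geom = sprintf('%dx%d+%d+%d', $word->{width}, $word->{height}, $offsetx, $offsety);
        my $x = $lineimg->Composite(
            image    => $wordimg,  # the word image to lay onto the line image
            compose  => 'Over',
            geometry => $geom,     # the +x+y part positions the word
        );
        warn "$x" if "$x";

        # Question 2: copy a region by cropping a clone, leaving $img intact.
        my $region = $img->Clone();
        $region->Crop(geometry => "${w}x${h}+${x0}+${y0}");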

    Read the article

  • Do complex JOINs cause high coupling and maintenance problems?

    - by ashkan.kh.nazary
    Our project has ~40 tables with complex relations. A colleague believes in using long join queries, which forces me to learn about tables outside of my module, but I think I shouldn't concern myself with tables not directly related to my module, and should instead use data access functions (written by those responsible for the other modules) when I need data from them. Let me clarify: I am responsible for the ContactVendor module, which lets customers contact the vendor and start a conversation about some specific product. The Products module has its own complex tables and relations, with functions that encapsulate details (for example i18n, activation, product availability, etc.). Now I need to show the product title of a product related to a conversation between the vendor and a customer. I can either write a long query that retrieves the product info along with the conversation stuff in one shot (which forces me to learn about the Product tables), OR pass the relevant product_id to the get_product_info(int) function. The first approach is obviously demanding and introduces many things I normally consider faults in programming. The problem with the second approach is the countless mini-queries these access functions cause; performance loss is a concern when a loop fetches product titles for 100 products using functions that each run a separate query. So I'm stuck between "don't code to the implementation, code to the interface" and performance. What is the right way of doing things? UPDATE: I'm especially concerned about possible future modifications to those tables outside of my module. What if the Products module decides to change the way it does things, or for some reason modifies the schema? Other modules would break or malfunction until the change is integrated into them: the usual ripple-effect problem.
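
    One common middle ground, sketched here in PHP for concreteness (the function, table and column names are hypothetical, not from the question): keep the module boundary, but have the Products module expose a batched accessor so 100 titles cost one query rather than 100:

        // Hedged sketch: a batched accessor exposed by the Products module.
        function get_product_titles(array $product_ids) {
            global $db; // assumed PDO connection owned by the Products module
            if (empty($product_ids)) {
                return array();
            }
            $placeholders = implode(',', array_fill(0, count($product_ids), '?'));
            $stmt = $db->prepare("SELECT id, title FROM products WHERE id IN ($placeholders)");
            $stmt->execute(array_values($product_ids));
            return $stmt->fetchAll(PDO::FETCH_KEY_PAIR); // id => title map
        }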

    Read the article

  • Data Modeling Help - Do I add another table, change existing table's usage, or something else?

    - by StackOverflowNewbie
    Assume I have the following tables and relationships:

        Person
        - Id (PK)
        - Name

    A Person can have 0 or more pets:

        Pet
        - Id (PK)
        - PersonId (FK)
        - Name

    A person can have 0 or more attributes (e.g. age, height, weight):

        PersonAttribute
        - Id (PK)
        - PersonId (FK)
        - Name
        - Value

    PROBLEM: I need to represent pet attributes, too. As it turns out, these pet attributes are, in most cases, identical to the attributes of a person (e.g. a pet can have an age, height, and weight too). How do I represent pet attributes? Do I create a PetAttribute table?

        PetAttribute
        - Id (PK)
        - PetId (FK)
        - Name
        - Value

    Do I change PersonAttribute to GenericAttribute and have 2 foreign keys in it, one connecting to Person, the other connecting to Pet?

        GenericAttribute
        - Id (PK)
        - PersonId (FK)
        - PetId (FK)
        - Name
        - Value

    NOTE: if PersonId is set, then PetId is not set. If PetId is set, PersonId is not set. Do something else?
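
    If the GenericAttribute route is taken, the "exactly one owner" note can be enforced by the database itself. A hedged T-SQL sketch (the column types are assumptions inferred from the other tables; adjust to the actual DBMS):

        -- Hedged sketch: one attribute table, with a CHECK constraint
        -- enforcing that exactly one of PersonId/PetId is set per row.
        CREATE TABLE GenericAttribute (
            Id       UNIQUEIDENTIFIER PRIMARY KEY,
            PersonId UNIQUEIDENTIFIER NULL REFERENCES Person(Id),
            PetId    UNIQUEIDENTIFIER NULL REFERENCES Pet(Id),
            Name     NVARCHAR(100)    NOT NULL,
            Value    NVARCHAR(MAX)    NOT NULL,
            CONSTRAINT CK_GenericAttribute_OneOwner CHECK (
                (PersonId IS NOT NULL AND PetId IS NULL) OR
                (PersonId IS NULL     AND PetId IS NOT NULL)
            )
        );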

    Read the article
