Search Results

Search found 236 results on 10 pages for 'cesar downs'.

Page 8/10 | < Previous Page | 4 5 6 7 8 9 10  | Next Page >

  • Site Goes Offline Every Day At Midnight - No One Knows Why

    - by HollerTrain
    Seems today a website I manage has been going online and offline between 12a and 12:25a. I have no idea what is causing the issue, so I am seeking guidance on where to start. It is a WordPress-based site. Here is what I DO know: I have a Pingdom account which alerts me when the site goes offline, so I can see that every day, like clockwork, the site goes on and off. At the time of the ups/downs I see a lot of strain on the memory usage. Look at the load average when the site is going online/offline (http://screencast.com/t/BRlfXkqrbJII). Then I ran this command to restart http (http://screencast.com/t/usVtYWZ2Qi) and the memory usage then goes down to this (http://screencast.com/t/VdTIy3bgZiQB). An hour after I restarted http, the site went offline/online again, so restarting http didn't help much. When the site is going offline/online, I ran the top command and get this (http://screencast.com/t/zEwr7YQj3). Here is a top command when the site is at its lowest (http://screencast.com/t/eaMfha9lbT - so this would be dubbed "normal"). Here is a bandwidth report (http://screencast.com/t/AS0h2CH1Gypq). The traffic doesn't seem to be that much (http://screencast.com/t/s7hrWNNic1K), but looking at the times the site goes up/down, this may be one of the reasons? I have the dvp Nitro package at Media Temple (http://mediatemple.net/webhosting/nitro/). So at this point I would request some help in trying to figure out what the cause of this is, and how I can go about pinpointing this issue. ANY HELP is greatly appreciated.

    Read the article

  • Migrating WebLogic 10.3.0 to new host. Slow managed server startup times

    - by wadevondoom
    We are migrating our Blue Martini Commerce application (only supported on WebLogic 10.3.0) to a new host (Red Hat 6.3 on a VMware ESX VM). We are seeing extremely slow startup times for our managed servers, basically 20x slower than our current production. For instance, the Publish managed server takes ~30-45 seconds in current production, while in the new environment it takes ~10 minutes. The setup uses the same domain structure and JVM as the current production environment, and the same setup files are used. We use jdk1.6.0_33 on 64-bit architecture. We used the generic 64-bit WebLogic installer and used the pack/unpack utilities to migrate the domain. The JAVA_OPTS to start this server are: "-d64 -Xms256m -Xmx512m -XX:PermSize=48m -XX:MaxPermSize=256m". The sysadmins have checked /etc/sysctl.conf and /etc/limits.conf to ensure we were not hitting some kind of process limit. As I am not sure what this managed server does from a Blue Martini perspective during startup, I also had the DBA check that Oracle RAC (11.2.0.3) wasn't hitting some kind of process limit, and that there wasn't a TNS listener issue. The new host is quite a bit stricter with its server lockdowns, so there are a few differences: Red Hat 6.3 in the new env vs. RH 5.7 in current; SELinux is targeted in the new env and disabled in current; a VM in the new env vs. dedicated hardware in current; iptables disabled in current (it was enabled in the new env, but I had them disable it just in case). I apologize for not being more specific; I am mostly hoping for some tips. I do not have the typical root access I would normally have in this environment, and I am just hoping for a path forward. I did a few 'kill -3's to see if there are blocked threads and got nada. The service works for all intents and purposes; it is just painfully slow. Thank you all in advance for reading, and best regards. Wade

    Read the article

  • Microsoft Mouse and Keyboard Center - Slow response for App-specific shortcuts

    - by Darrel Hoffman
    So a few months ago, I bought a new MS mouse, and was surprised that they'd discontinued IntelliPoint in favor of this Microsoft Mouse and Keyboard Center. It seems to have the same functionality underneath all the bloat, but there's a very serious drawback: when I set up application-specific functions for the extra buttons on the mouse, they work, but sometimes with a very long delay, up to a minute or more. For example, I often set up the left side button as an "Undo" in various programs for convenience. But sometimes, when I try to use that Undo button, nothing happens, so I'm forced to use the standard Ctrl-Z or whatever. But then, a minute or so later, it suddenly remembers that I hit that button a while back, and calls the Undo unexpectedly on something entirely different. It's infuriating. No modern computer function should be this slow. It's not the software or the computer itself, because doing an Undo via Ctrl-Z or the menu still works instantly. It's very definitely a side effect of a delayed response to the mouse button. Usually after it delays the first time, it'll work quickly after that, but if you haven't used a given shortcut in several minutes, it "forgets" again and you get another inexplicably long delay. IntelliPoint never had this problem, but it's not supported any more, and not compatible with the newer mice. Has anyone else noticed slowdowns with Mouse and Keyboard Center and app-specific shortcuts? Any ideas how to get around this? I use these shortcuts extensively in my workflow, and it's just entirely unacceptable to have such a long delay in what should be a pretty basic feature.

    Read the article

  • Selecting whole column except first X (header) cells in Excel

    - by Robert Koritnik
    I know I can select all cells in a particular column by clicking on the column header descriptor (i.e. A or AB). But is it possible to then exclude a few cells from the selection, like my data table headings? Example: I would like to select the data cells of a particular column to set Data Validation (that would eventually display a drop-down of list values defined in a named range). But I don't want my data header cells to be included in this selection (so they won't have these drop-downs displayed, nor will they be validated). What if I later decide to change the validation settings of these cells? How can I select my column then? A side note: I know I can set data validation on the whole column and then select only those cells that I want to exclude and clear their data validation. What I would like to know is whether it is possible to do the correct selection in the first step, to avoid the second one. I tried clicking on the column descriptor to select the whole column and then Ctrl-clicking the cells I don't want to include in my selection, but it didn't work as expected.

    Read the article

  • How to improve network performance between two Win 2008 KVM guests that already have the virtio driver?

    - by taazaa
    I have two physical servers with Ubuntu 10.04 Server on them. They are connected with a 1Gbps card over a gigabit switch. Each of these host servers has one Win 2008 guest VM. Both VMs are well provisioned (4 cores, 12GB RAM) with RAW disks. My ASP.NET/SQL Server applications are running much slower compared to very similar physical setups. Both machines are set up to use virtio for disk and network. I used iperf to check network performance and I get:

    Physical host 1 ----- Physical host 2: 957 Mbits/sec
    Physical host 1 ----- Win 08 Guest 1: 557 Mbits/sec
    Win 08 Guest 1 ----- Physical host 1: 182 Mbits/sec
    Win 08 Guest 1 ----- Win 08 Guest 2: 111 Mbits/sec

    My app is running on Win 08 Guest 1 and Guest 2 (web and db). There is a huge drop in network throughput (almost 90%) between the two guests. Further, the throughput does not seem to be symmetric between host and guest either. The CPU utilization on the guests and hosts is less than 2% right now (we are just testing). Apart from this, there have been random slowdowns in the network to as low as 1 Mbit/sec, making the whole application unusable. Any help troubleshooting this would be appreciated.

    Read the article

  • Screen recording in Windows 8 makes PC unusable?

    - by Skadier
    OS used: Windows 8 Pro x64. Hi there, I've got a weird problem. I tried to record my screen using different screen recording applications like Camtasia 7, HyperCam, and some others. If I hit "record", everything works fine for about 3-4 seconds. Then the PC slows down heavily, caused by extremely high IO usage. The PC gets nearly unusable and lags like hell. I can't switch applications or get Task Manager with Ctrl+Alt+Del, or do anything else, without waiting 1-3 minutes. After about 5-8 minutes and a bit of luck, I am sometimes able to end the recorder's process, and the IO calms down very slowly (100% IO lasts for about 2 minutes before returning to normal). I don't know why this happens. Screen recorders that support Windows 8 aren't out yet AFAIK, but at least Camtasia 7 worked with the Windows 8 Developer Preview and Windows 8 Release Preview (I only had to change the record method to .avi). No slowdowns or anything else. Is there anyone out there who knows this problem too? Is there maybe a solution to it?

    Read the article

  • Some VS 2010 RC Updates (including patches for Intellisense and Web Designer fixes)

    - by ScottGu
    [In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu]

    We are continuing to make progress on shipping Visual Studio 2010. I'd like to say a big thank you to everyone who has downloaded and tried out the VS 2010 Release Candidate, and especially to those who have sent us feedback or reported issues with it. This data has been invaluable in helping us find and fix remaining bugs before we ship the final release. Last month I blogged about a patch we released for the VS 2010 RC that fixed a bad intellisense crash issue. This past week we released two additional patches that you can download and apply to the VS 2010 RC to immediately fix two other common issues we've seen people run into:

    Patch that fixes crashes with Tooltip invocation and when hovering over identifiers

    The Visual Studio team recently released a second patch that fixes some crashes we've seen when tooltips are displayed – most commonly when hovering over an identifier to view a QuickInfo tooltip. You can learn more about this issue from this blog post, and download and apply the patch here.

    Patch that fixes issues with the Web Forms designer not correctly adding controls to the auto-generated designer files

    The Visual Web Developer team recently released a patch that fixes issues where web controls are not correctly added to the .designer.cs file associated with the .aspx file – which means they can't be programmed against in the code-behind file. This issue is most commonly described as "controls are not being recognized in the code-behind" or "editing existing .aspx files regenerates the .aspx.designer.(vb or cs) file and controls are now missing" or "I can't embed controls within the Ajax Control Toolkit TabContainer or the <asp:createuserwizard> control". You can learn more about the issue here, and download the patch that fixes it here.

    Common Cause of Intellisense and IDE sluggishness on Windows XP, Vista, Win Server 2003/2008 systems

    Over the last few months we've occasionally seen reports of people seeing tremendous slowness when typing and using intellisense within VS 2010, despite running on decent machines. It took us a while to track down the cause – but we have found that the common culprit seems to be that these machines don't have the latest version of the UIA (Windows Automation) component installed. UIA 3 ships with Windows 7, and is a recommended Windows Update patch on XP and Vista (which is why we didn't see the problem in our tests – since our machines are patched with all recommended updates). Many systems (especially on XP) don't automatically install recommended updates, though, and are running with older versions of UIA. This can cause significant performance slowdowns within the VS 2010 editor when large lists are displayed (for example: with intellisense). If you are running on Windows XP, Vista, or Windows Server 2003 or 2008 and are seeing any performance issues with the editor or IDE, please install the free UIA 3 update that can be downloaded from this page. If you scroll down the page you'll find direct links to versions for each OS. Note that we are making improvements to the final release of VS 2010 so that we don't have big perf issues when UIA 3 isn't installed – and we are also adding a message within the IDE that will warn you if you don't have UIA 3 installed and accessibility is activated.

    Improved Text Rendering with WPF 4 and VS 2010

    We recently made some nice changes to WPF 4 which improve text clarity and crispness over what was in the VS 2010/.NET 4 Release Candidate. In particular, these changes improve scenarios where you have a dark background with light text. You can learn more about these improvements in this WPF Team blog post. These changes will be in the final release of VS 2010 and .NET 4.

    Hope this helps, Scott

    Read the article

  • Knockout.js - Filtering, Sorting, and Paging

    - by jtimperley
    Originally posted on: http://geekswithblogs.net/jtimperley/archive/2013/07/28/knockout.js---filtering-sorting-and-paging.aspx

    Knockout.js is fantastic! Maybe I missed it, but it appears to be missing flexible filtering, sorting, and pagination for its grids. This is a summary of my attempt at creating this functionality, which has been working out amazingly well for my purposes. Before you continue, this post is not intended to teach you the basics of Knockout; they have already created a fantastic tutorial for that purpose. You'd be wise to review it before you continue. http://learn.knockoutjs.com/

    Please view the full source code and functional example on jsFiddle. Below you will find a brief explanation of some of the components. http://jsfiddle.net/JTimperley/pyCTN/13/

    First we need to create a model to represent our records. This model is a simple container with defined and guaranteed members.

    function CustomerModel(data) {
        if (!data) {
            data = {};
        }
        var self = this;
        self.id = data.id;
        self.name = data.name;
        self.status = data.status;
    }

    Next we need a model to represent the page as a whole, with an array of the previously defined records. I have intentionally overlooked the filtering and sorting options for now. Note how the filtering, sorting, and pagination are chained together to accomplish all three goals. This strategy allows each of these pieces to be used selectively based on the page's needs. If you only need sorting, just sort, etc.

    function CustomerPageModel(data) {
        if (!data) {
            data = {};
        }
        var self = this;
        self.customers = ExtractModels(self, data.customers, CustomerModel);
        var filters = […];
        var sortOptions = […];
        self.filter = new FilterModel(filters, self.customers);
        self.sorter = new SorterModel(sortOptions, self.filter.filteredRecords);
        self.pager = new PagerModel(self.sorter.orderedRecords);
    }

    The code currently supports text box and drop-down filters. Text box filters require defining the current 'Value' and a 'RecordValue' function to retrieve the filterable value from the provided record. Drop-downs allow defining all possible values, the current option, and the 'RecordValue' as before. Once these filters are defined, they are automatically added to the screen, and any changes to their values will automatically update the results, causing their sort and pagination to be re-evaluated.

    var filters = [
        {
            Type: "text",
            Name: "Name",
            Value: ko.observable(""),
            RecordValue: function(record) { return record.name; }
        },
        {
            Type: "select",
            Name: "Status",
            Options: [
                GetOption("All", "All", null),
                GetOption("New", "New", true),
                GetOption("Recently Modified", "Recently Modified", false)
            ],
            CurrentOption: ko.observable(),
            RecordValue: function(record) { return record.status; }
        }
    ];

    Sort options are simpler and are also automatically added to the screen. Simply provide each option's name and value for the sort drop-down, as well as a function defining how the records are compared. This mechanism can easily be adapted to use table headers as the sort triggers; that strategy hasn't crossed my functionality needs at this point.

    var sortOptions = [
        {
            Name: "Name",
            Value: "Name",
            Sort: function(left, right) { return CompareCaseInsensitive(left.name, right.name); }
        }
    ];

    Paging options are completely contained by the pager model. Because we will be chaining arrays between our filtering, sorting, and pagination models, the following utility method is used to prevent errors when handing an observable array to another observable array.
    function GetObservableArray(array) {
        if (typeof(array) == 'function') {
            return array;
        }
        return ko.observableArray(array);
    }
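
    The post leaves the internals of FilterModel, SorterModel, and PagerModel to the jsFiddle. Below is a rough sketch of the chaining idea only; it is my guess at the shape, not the author's actual implementation, and it handles just the text filters, ignoring the select-based ones:

    function FilterModel(filters, records) {
        var self = this;
        self.filters = filters;
        // Recomputes whenever a filter's observable Value or the source array
        // changes; the resulting array feeds SorterModel next in the chain.
        self.filteredRecords = ko.computed(function() {
            var source = ko.utils.unwrapObservable(records);
            return ko.utils.arrayFilter(source, function(record) {
                for (var i = 0; i < filters.length; i++) {
                    var filter = filters[i];
                    var current = filter.Value ? filter.Value() : null;
                    if (current && ('' + filter.RecordValue(record)).toLowerCase()
                            .indexOf(('' + current).toLowerCase()) < 0) {
                        return false; // record fails an active filter
                    }
                }
                return true; // record passed every active filter
            });
        });
    }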

    Read the article

  • New Release of Oracle EPM (Enterprise Performance Management)

    - by Theresa Hickman
    I'm a huge fan of Hyperion products and consider Hyperion to be one of the best acquisitions Oracle has made in terms of applications. So I am really excited to talk about their latest release, Release 11.1.2 of the Oracle EPM System. This is EPM's largest release in two years, and it's jam-packed with new modules and features. In terms of brand-new products, there are three:

    1. Public Sector Planning and Budgeting meets the needs of public sector agencies, higher education, governments, etc. that have complex budget requirements. It supports position- or employee-based budgeting and integrates with MS Office and your ERP ledgers to perform commitment control.

    2. Hyperion Financial Close Management is a complete financial close solution that orchestrates the entire close process, from subledgers and general ledger to financial reporting and disclosure submissions. And of course, it is integrated with GL systems and consolidation systems. I saw a demo of this and it looked pretty slick. It has a unified close calendar that looks like a regular calendar and gives each person participating in the close process a task list. It comes with a Gantt chart that shows the relationships and dependencies among closing tasks. There are dashboards that let you track close progress and task completion, as well as perform trend analysis and see how much time is being spent on different activities in the close process. This gives you visibility you never had before to understand where the bottlenecks are and where improvements could be made. I think what I liked best about this product is that it provides a central place for all participants to communicate their progress. When I worked as an accountant, we used ad hoc tools such as spreadsheets, Word documents, emails, and phone calls during the close process. I like the idea of having a central system to track the overall progress as well as automate the entire financial close process. Who knows, maybe accountants won't have to revolve their lives around the month-end close anymore with a tool like this. Those periodic fire drills can become predictable, well-managed processes.

    3. Disclosure Management is an out-of-the-box, pre-packaged XBRL solution to meet statutory reporting requirements. This product is really going to help companies improve the timeliness of producing financial reports. Reports can be authored using MS Word and Excel, and XBRL instance documents can then be produced with embedded XBRL tags. It even supports footnotes and disclosures of non-financial information. With a product like this, companies no longer have to outsource their XBRL filing; they can bring it back in house to save costs and time.

    In terms of other enhancements, there is ERP Integrator, which provides integration and drill-downs from Hyperion products to source systems such as Oracle E-Business Suite, PeopleSoft, and SAP. No other vendor offers this level of integration. There's also a new product that links Oracle Essbase directly to Hyperion Financial Management for internal financial reporting, and new integrations between Hyperion Financial Management and Oracle's GRC products. They also improved the usability of Oracle Hyperion Planning, making it much easier for end users to submit plans and budgets via the web or via MS Excel. It is also integrated with intelligent approval workflows that are data-driven, user-configurable, and scenario-specific, to efficiently streamline the budgeting process.

    Here's the press release from April 7, 2010. Here's the pre-recorded webcast where you can see the demos; just register and watch the hour-long presentation. And finally, here's the newsletter.

    Read the article

  • T-SQL Tuesday #005 : SSRS Parameters and MDX Data Sets

    - by blakmk
    Well, this week's T-SQL Tuesday #005 topic seems quite fitting. Having spent the past few weeks creating reports and dashboards in SSRS and SSAS 2008, I was frustrated by how difficult it is to use custom datasets to generate parameter drill-downs. It also seems Reporting Services can be quite unforgiving when it comes to renaming things like datasets, so I want to share a couple of techniques that I found useful. One of the things I regularly do is add parameters to the queries. However, doing this causes Reporting Services to generate a hidden dataset and parameter name for you. One of the things I like to do is tweak these hidden datasets, removing the 'ALL' level, which is a tip I picked up from Devin Knight in his blog. There are some rules I've developed for myself since working with SSRS and MDX; they may not be the best or only way, but they work for me.

    Rule 1 – Never trust the automatically generated hidden datasets, or even ANY automatically generated MDX queries for that matter... I've previously blogged about this here. If you examine the MDX generated in the hidden dataset, you will see that it generates the MDX in the context of the originating query by building a subcube. This means it may NOT be appropriate to use it in a subsequent query which has a different context. Make sure you always understand what is going on. Often when I'm developing a dashboard or a report, there are several parameter-oriented datasets that I like to create manually. It can be that I have different datasets using the same dimension but in a different context. One example of this is that I often use a dataset for the last month and a dataset for the last 6 months; both use the same date hierarchy. However, Reporting Services seems not to be too smart when it comes to generating unique datasets when working with and renaming parameters and datasets. Very often I have come across this error when refactoring parameter names and default datasets: "an item with the same key has already been added". The only way I've found to reliably avoid this is to obey rule 2.

    Rule 2 – Follow this sequence when working with parameters and datasets:
    1. Create lookup and default datasets in advance.
    2. Create parameters (set the datasets for available and default values).
    3. Go into the query and tick the parameter check box.
    4. On the dataset properties screen, select the parameter defined earlier for the parameter value defined earlier.

    Rule 3 – Don't tear your hair out when you have just renamed objects and your report doesn't build. Just use XML Notepad on the original report file. I found I gained a good understanding of the structure of the underlying XML document just by using XML Notepad. From this you can do a search and find references to the missing object. You can also just do a wholesale search and replace (after taking a backup copy, of course ;-).

    So I hope the above helps to save the sanity of anyone who regularly works with SSRS and MDX. @Blakmk

    Read the article


  • Essbase BSO Data Fragmentation

    - by Ann Donahue
    Data fragmentation naturally occurs in Essbase Block Storage (BSO) databases where there are a lot of end-user data updates, incremental data loads, many lock-and-sends, and/or many calculations executed. If an Essbase database starts to experience performance slowdowns, this is an indication that there may be too much fragmentation. See Chapter 54, Improving Essbase Performance, in the Essbase DBA Guide for more details on measuring and eliminating fragmentation: http://docs.oracle.com/cd/E17236_01/epm.1112/esb_dbag/daprcset.html

    Fragmentation is likely to occur in the following situations:
    - Read/write databases where users are constantly updating data
    - Databases that execute calculations around the clock
    - Databases that frequently update and recalculate dense members
    - Data loads that are poorly designed
    - Databases that contain a significant number of Dynamic Calc and Store members
    - Databases that use an isolation level of uncommitted access with commit block set to zero

    There are two types of data block fragmentation: free space tracking, which is measured using the Average Fragmentation Quotient statistic, and block order on disk, which is measured using the Average Cluster Ratio statistic.

    Average Fragmentation Quotient: this ratio measures free space in a given database. As you update and calculate data, empty spaces occur when a block can no longer fit in its original space and is either appended to the end of the file or placed into another empty space that is large enough. These empty spaces take up space in the .PAG files. The higher the number, the more empty spaces you have; therefore, the bigger the .PAG file and the longer it takes to traverse the .PAG file to get to a particular record. An Average Fragmentation Quotient value of 3.174765 means the database is 3% fragmented with free space.

    Average Cluster Ratio: this statistic describes the order in which the blocks actually exist in the database. An Average Cluster Ratio of 1 means all the blocks are ordered in the correct sequence, in the order of the outline. As you load data and calculate data blocks, the sequence can start to fall out of order, because when a block is rewritten it may not be placed back in the exact same spot in the database where it existed before. The lower this number, the more out of order the blocks become and the more it affects performance. An Average Cluster Ratio value of 1 means no fragmentation; any value lower than 1 (e.g. 0.01032828) means the data blocks are getting further out of order relative to the outline order.

    Eliminating Data Block Fragmentation: both types of data block fragmentation can be removed by doing a dense restructure or an export/clear/import of the data. There are two types of dense restructure:

    1. Implicit Restructures. An implicit dense restructure happens when outline changes are made using the EAS Outline Editor or Dimension Build. Essbase creates new .PAG files, restructuring the data blocks in them. When Essbase restructures the data blocks, it regenerates the index automatically so that index entries point to the new data blocks. Empty blocks are NOT removed by implicit restructures.

    2. Explicit Restructures. An explicit dense restructure happens when a database restructure is initiated manually. An explicit dense restructure is a full restructure, which comprises a dense restructure as outlined above plus the removal of empty blocks.

    Empty Blocks vs. Fragmentation: the existence of empty blocks is not considered fragmentation. Empty blocks can be created through calc scripts or formulas. An empty block adds to the existing database block count and is included in the block counts of the database properties. There are no statistics for empty blocks. The only way to determine whether empty blocks exist in an Essbase database is to record your current block count, export the entire database, clear the database, then import the exported data. If the block count decreased, the difference is the number of empty blocks that had existed in the database.

    Read the article

  • How can I force Javascript garbage collection in IE? IE is acting very slow after AJAX calls & DOM

    - by RenderIn
    I have a page with chained drop-downs. Choosing an option from the first select populates the second, and choosing an option from the second select returns a table of matching results, inserted via the innerHTML property of an empty div on the page. The problem is, once I've made my selections and a considerable amount of data is brought onto the page, all subsequent JavaScript on the page runs exceptionally slowly. It seems as if all the data I pulled back via AJAX to populate the div is still hogging a lot of memory. I tried setting the returned object which contains the AJAX results to null after setting innerHTML, but with no luck. Firefox, Safari, Chrome, and Opera all show no performance degradation when I use JavaScript to insert a lot of data into the DOM, but in IE it is very apparent. To test that it's a JavaScript/DOM issue rather than a plain old IE issue, I created a version of the page that returns all the results on the initial load, rather than via AJAX/JavaScript, and found IE had no performance problems. FYI, I'm using jQuery's jQuery.get method to execute the AJAX call. EDIT: This is what I'm doing:

    <script type="text/javascript">
    function onFinalSelection() {
        var searchParameter = jQuery("#second-select").val();
        jQuery.get("pageReturningAjax.php",
            {SEARCH_PARAMETER: searchParameter},
            function(data) {
                jQuery("#result-div").get(0).innerHTML = data;
                //jQuery("#result-div").html(data); //Tried this, same problem
                data = null;
            },
            "html");
    }
    </script>
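
    One thing worth trying, offered as a sketch rather than a confirmed fix: empty the div with jQuery before inserting the new markup, so jQuery can purge its event/data caches for the old nodes, then nudge IE's JScript garbage collector. CollectGarbage() is an IE-only global; whether it helps in this case is an assumption on my part:

    function onFinalSelection() {
        var searchParameter = jQuery("#second-select").val();
        jQuery.get("pageReturningAjax.php", {SEARCH_PARAMETER: searchParameter}, function(data) {
            var resultDiv = jQuery("#result-div");
            // empty() removes the old child nodes AND clears jQuery's internal
            // event/data cache for them, a classic source of leaked memory in
            // older IE when large fragments are replaced.
            resultDiv.empty();
            resultDiv.html(data);
            // IE exposes a JScript GC hook; other browsers skip this branch.
            if (window.CollectGarbage) {
                window.CollectGarbage();
            }
        }, "html");
    }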

    Read the article

  • What is the right way to scale a Flex application up to fullscreen?

    - by Impirator
    Fullscreen mode and I have been battling for a while in this Flex application, and I'm coming up short on Google results to end my woes. I have no problem going into fullscreen mode by doing Application.application.stage.displayState = StageDisplayState.FULL_SCREEN;, but the rest of the content just sits there in the top left corner at its original size. All right, says I, I'll just do stage.scaleMode = StageScaleMode.SHOW_ALL and make it figure out how to pull this off. And it looks like it does. Except that when you mouse over the individual checkboxes and buttons and various components, they all fidget slightly: just a slight jump up or down as they resize... on mouse over. Well, this is frustrating, but bearable; I can always just invoke invalidateSize() explicitly for all of them. But not the combo boxes. The ones at the bottom have their menus go off the bottom of the screen, and when I pop out of fullscreen mode, their drop-downs are cut off halfway. I have no idea how to fix that. Can someone step in here and put me out of my misery? What is the right way to scale a Flex application up to fullscreen?

    var button:Button = button_fullscreen;
    try {
        if (stage.displayState == StageDisplayState.FULL_SCREEN) {
            Application.application.stage.displayState = StageDisplayState.NORMAL;
            button.label = "View Fullscreen Mode";
            stage.scaleMode = StageScaleMode.NO_SCALE;
        } else {
            Application.application.stage.displayState = StageDisplayState.FULL_SCREEN;
            button.label = "Exit Fullscreen Mode";
            stage.scaleMode = StageScaleMode.SHOW_ALL;
        }
        invalidateSizes(); // Calls invalidateSize() explicitly on several components.
    } catch (error:SecurityError) {
        Alert.show("The security settings of your computer prevent this from being displayed in fullscreen.", "Error: " + error.name + " #" + error.errorID);
    } catch (error:Error) {
        Alert.show(error.message, error.name + " #" + error.errorID);
    }

    Read the article

  • Data Collection (Offline - no internet) and then syncing it to generate reports from server

    - by Nishant
    So, I have a new project I am planning on taking, and I need to know what skills will be required to achieve it. The project is to do intensive data collection in the field, where there is no internet access. As part of the data collection, images will be uploaded, which will have to be resized, etc. Once the data collection occurs, the data needs to be consolidated and reported on. I am thinking there are two ways of generating the report:
    1. A PDF that can be designed.
    2. An executable file (since the PDF will be huge due to the multiple images, etc.) that is navigation-friendly, with drop-downs etc. It might not literally be an executable file; it could be a web page or some other format, as long as it can be delivered to the client in a friendly, professional way.
    The PDF will have to be generated somehow so that it can be printed as a hard copy. What languages and skill sets will I need to accomplish this project?

    Read the article

  • ASP.NET MVC intermittent slow response

    - by arehman
    Problem: In our production environment, the system occasionally delays the page response of an ASP.NET MVC application by up to 30 seconds or so, even though the same page renders in 2-3 seconds most of the time. This happens randomly with any arbitrary page, and with both GET and POST requests. For example, log files indicate the system took 15 seconds to complete a request for a jQuery script file, or 10 seconds for another small CSS file. Similar problems: random slowdowns. Production environment: Windows Server 2008 Standard (32-bit), app pool running in integrated mode, ASP.NET MVC 1.0. What we have tried/observed: We moved the application to a standalone web server, but it didn't help. We have never noticed the same issue on the server for any other ASP.NET application. App pool settings are fine; no abrupt recycles/shutdowns. No CPU spikes or memory problems. No delays due to SQL queries or the like. It seems as if something is causing a delay along the HTTP pipeline, or the worker process is seeing the request late. Looking for other suggestions. -- Thanks

    Read the article

  • How do you clear a CustomValidator Error on a Button click event?

    - by George
    I have a composite user control for entering dates. The CustomValidator will include server-side validation code. I would like the error message to be cleared via client-side script if the user alters the date value in any way. To do this, I included the following code to hook up the two drop-downs and the year text box to the validation control:

    <script type="text/javascript">
        ValidatorHookupControlID("<%= ddlMonth.ClientID%>", document.getElementById("<%= CustomValidator1.ClientID%>"));
        ValidatorHookupControlID("<%= ddlDate.ClientID%>", document.getElementById("<%= CustomValidator1.ClientID%>"));
        ValidatorHookupControlID("<%= txtYear.ClientID%>", document.getElementById("<%= CustomValidator1.ClientID%>"));
    </script>

    However, I would also like the validation error to be cleared when the user clicks the Clear button, which resets the other 3 controls. To avoid a postback, the Clear button is a regular HTML button with an OnClick event that resets the 3 controls. Unfortunately, the ValidatorHookupControlID method does not seem to work on HTML controls, so I thought to change the HTML button to an ASP button and hook up to that control instead. However, I cannot seem to eliminate the postback behavior associated by default with the ASP button control. I tried setting UseSubmitBehavior to False, but it still submits. I tried returning false in my btnClear_OnClick client code, but the code sent to the browser included a DoPostBack call after my call:

    btnClear.Attributes.Add("onClick", "btnClear_OnClick();")

    Instead of adding OnClick code, I tried overwriting it, but the DoPostBack code was still included in the final code sent to the browser:

    btnClear.Attributes.Item("onClick") = "btnClear_OnClick();"

    What do I have to do to get the Clear button to clear the CustomValidator error when clicked and avoid a postback?
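
    A sketch of one possible approach (untested against this exact control, and the element ids below are placeholders for the real ClientIDs): have the client handler return false so the injected submit behavior never runs, and reset the validator through ASP.NET's client validation API, which defines ValidatorUpdateDisplay and an isvalid flag on each validator element:

    function btnClear_OnClick() {
        // Reset the three date controls (placeholder ids; emit <%= ... .ClientID %> in practice).
        document.getElementById("ddlMonth").selectedIndex = 0;
        document.getElementById("ddlDate").selectedIndex = 0;
        document.getElementById("txtYear").value = "";

        // Mark the CustomValidator valid and hide its message client-side.
        var validator = document.getElementById("CustomValidator1");
        if (validator && typeof ValidatorUpdateDisplay === "function") {
            validator.isvalid = true;
            ValidatorUpdateDisplay(validator);
        }
        return false; // suppress the ASP button's postback
    }

    Wiring this up with OnClientClick="return btnClear_OnClick();" on the ASP button lets the false return value reach the browser's click handling, which Attributes.Add("onClick", "btnClear_OnClick();") alone does not.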

    Read the article

  • How To Find Reasons of Why Site Goes Online/Offline

    - by HollerTrain
    Seems today a website I manage has been going online and offline throughout the entire day. I have no idea what is causing the issue, so I am seeking guidance on where to start. It is a WordPress-based site. Here is what I DO know: I use a program that pings the server every minute and emails me when the server is not responding, so I know exactly when the site is online and offline. The site went on and off between 8pm and 12pm on 12.28, and around the 1a hour early in the morning of 12.29 (New York City timezone; all times below are in the same timezone). At the time of the ups/downs I see a lot of strain on the memory usage. Look at the load average when the site is going online/offline (http://screencast.com/t/BRlfXkqrbJII). Then I ran this command to restart http (http://screencast.com/t/usVtYWZ2Qi) and the memory usage then goes down to this (http://screencast.com/t/VdTIy3bgZiQB). An hour after I restarted http, the site went offline/online again, so restarting http didn't do much. When the site is going offline/online, I ran the top command and get this (http://screencast.com/t/zEwr7YQj3). Here is a top command when the site is at its lowest (http://screencast.com/t/eaMfha9lbT - so this would be dubbed "normal"). Here is a bandwidth report (http://screencast.com/t/AS0h2CH1Gypq). The traffic doesn't seem to be that much (http://screencast.com/t/s7hrWNNic1K), but looking at the times the site goes up/down, this may be one of the reasons? I have the dvp Nitro package at Media Temple (http://mediatemple.net/webhosting/nitro/). So at this point I would request some help in trying to figure out what the cause of this is, and how I can go about pinpointing this issue. ANY HELP is greatly appreciated.

    Read the article

  • Calculating a date when a date has been chosen by the number of days

    - by Andy
    I have three selection drop-downs in a form, comprising day, month, and year. Pretty standard. I've omitted all the individual select options for the purposes of this question.

    <label for="start_date">Start Date<font class="required">*</font>:</label>
    <select name="place_booking[day_val]">
    <select name="place_booking[month_val]">
    <select name="place_booking[year_val]">

    Underneath this I have a selection for the number of days the client wishes to stay at the letting.

    <label for="number_of_days">Number of Days<font class="required">*</font>:</label>
    <select name="place_booking[number_of_days]">

    Underneath there is a space to display the departure date based on the two selections above.

    <label for="departure_date">Departure Date<font class="required">*</font>:</label>
    ? <- this is where I would like to display the calculated date once the above is selected

    Any help would be greatly appreciated.
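
    A minimal client-side sketch of the calculation. The lookups are assumptions: the form above only defines name attributes, so this reads the selects by name from the first form on the page, assumes their values are numeric, and writes into a hypothetical element with id "departure_date":

    function updateDepartureDate() {
        var form = document.forms[0];
        var day = parseInt(form["place_booking[day_val]"].value, 10);
        var month = parseInt(form["place_booking[month_val]"].value, 10) - 1; // JS Date months are 0-based
        var year = parseInt(form["place_booking[year_val]"].value, 10);
        var nights = parseInt(form["place_booking[number_of_days]"].value, 10);

        // Date() rolls day overflow into the next month/year automatically,
        // so start date + number of days needs no manual calendar logic.
        var departure = new Date(year, month, day + nights);
        document.getElementById("departure_date").innerHTML = departure.toDateString();
    }

    Calling this from the onchange of all four selects keeps the displayed date current.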

    Read the article

  • Keeping dates in order when using date_select and discarding year in Rails?

    - by MikeH
    My app has users who have seasonal products. When a user selects a product, we allow him to also select the product's season. We accomplish this by letting him select a start date and an end date for each product. We're using date_select to generate two sets of drop-downs: one for the start date and one for the end date. Including years doesn't make sense for our model, so we're using the option discard_year => true. To explain our problem, consider that our products are apples. Vendor X carries apples every year from September to January. Years are irrelevant here, and that's why we're using discard_year => true. However, while the specific years are irrelevant, the relative point in time from the start date to the end date is relevant. This is where our problem arises. When you use discard_year => true, Rails does set a year in the database; it just doesn't appear in the views. Rails sets all the years to 0001 in our app. Going back to our apple example, this means that the database now thinks the user has selected September 0001 to January 0001. This is a problem for us for a number of reasons. To solve it, the logic I need to implement is the following: if the season_start month/date is before the season_end month/date, then the standard Rails approach is fine; but if the season_start month/date is AFTER the season_end month/date, then I need to dynamically update the database field so that the year for season_end equals the year for season_start + 1. My best guess is that I would create a custom method that runs as an after_save or after_update in my products model, but I'm not really sure how to do this. Ideas? Anybody ever had this issue? Thanks!

    Read the article

  • WPF - How do I use the UserControl with a dependency property and view model?

    - by user320849
    Hello, my goal is to have a user select a year and a month, translate the selection into a date, and have the user control send the date back to my view model. That part works for me... However, I cannot get the view model's initial date to set those drop-downs.

    public static readonly DependencyProperty Date =
        DependencyProperty.Register("ReturnDate", typeof(DateTime), typeof(DatePicker),
            new FrameworkPropertyMetadata { BindsTwoWayByDefault = true, });

    public DateTime ReturnDate
    {
        get { return Convert.ToDateTime(GetValue(Date)); }
        set
        {
            SetDropDowns(value);
            SetValue(Date, value);
        }
    }

    The SetDropDowns(value) call just sets the selected items on the combo boxes; however, the program never makes it to that method. On the view I am using:

    <cc1:DatePicker ReturnDate="{Binding Path=StartDate, Mode=TwoWay}" IsStart="True" />

    If this has been answered, then my bad; I looked around and didn't see anything that worked for me. Thus, when the program loads, how do I get the value from the view model to a method in order to set the combo boxes? Thanks, -Scott

    Read the article

  • What should I do with an over-bloated select-box/drop-down

    - by Tristan Havelick
    All web developers run into this problem when the amount of data in their project grows, and I have yet to see a definitive, intuitive best practice for solving it. When you start a project, you often create forms with select tags to help pick related objects for one-to-many relationships. For instance, I might have a system with Neighbors, where each Neighbor belongs to a Neighborhood. In version 1 of the application, I create an edit form with a drop-down that simply lists the 5 possible neighborhoods in my geographically limited application. In the beginning this works great: so long as I have maybe 100 records or fewer, my select box loads quickly and is fairly easy to use. However, let's say my application takes off and goes national. Instead of 5 neighborhoods I have 10,000. Suddenly my little drop-down takes forever to load, and once it loads, it's hard to find your neighborhood in the massive alphabetically sorted list. Now, in this particular situation, having hierarchical data and letting users drill down using several dynamically generated drop-downs would probably work okay. However, what is the best solution when the objects/records being selected are not hierarchical in nature? In the past, I've done this with a popup containing a search box and a list, but this seems clunky and dated. In today's web 2.0 world, what is a good way to find one object amongst many for one's forms? I've considered using an Ajaxified search box, but this seems to work best for free text, and falls apart a little when the data to be saved is just a reference to another object or record. Feel free to cite specific libraries with generic solutions to this problem, or simply share what you have done in your projects in a more general way.
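
    One concrete pattern, sketched here with jQuery UI's Autocomplete widget (the endpoint URL and field ids are placeholders, not from the question): pair a free-text search box with a hidden input that stores the selected record's id, so the form still posts a clean reference to the related object:

    $("#neighborhood-search").autocomplete({
        // Hypothetical endpoint returning [{label: "Elm Park", value: 42}, ...]
        source: "/neighborhoods/search",
        minLength: 2,
        select: function(event, ui) {
            // Show the human-readable name; post the id via the hidden field.
            $("#neighborhood-search").val(ui.item.label);
            $("#neighborhood-id").val(ui.item.value);
            return false; // keep the label, not the raw value, in the text box
        }
    });

    This keeps the convenience of an Ajaxified search box while still saving a reference rather than free text.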

    Read the article

  • How to maintain form state after Post-Redirect-Get in ASP.net?

    - by Ian Boyd
    Imagine a page with a form input:

    Search Criteria: crackers

    From: [email protected]
    To: [email protected]
    Subject: How to maintain form state with PRG?
    Message: Imagine a page with form input: ...
    [Send]

    After the user clicks Send, the server will instruct the client to redirect, as part of the Post-Redirect-Get pattern:

    POST /mail/u/compose HTTP/1.1

    303 See Other
    Location: http://stackoverflow.com/mail/u/compose

    And the client will issue a GET of the new page. The problem is that some elements of the existing form are lost:

    Search Criteria: (empty)

    It gets worse when there are a few drop-downs and checkboxes. How can I maintain form state when using Post-Redirect-Get in ASP.net, given that the viewstate is then non-existent?

    Bonus reading: ASP.NET: How to redirect, prefilling form data?
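
    Server-side, the usual answers are Session or, in ASP.NET MVC, TempData, which survives exactly one redirect. Purely as an illustrative alternative, here is a client-side sketch using sessionStorage in browsers that support it; the ids are placeholders, not from the question:

    // Stash the search box value before the POST navigates away.
    document.getElementById("composeForm").addEventListener("submit", function () {
        sessionStorage.setItem("searchCriteria",
            document.getElementById("searchCriteria").value);
    });

    // After the 303 redirect lands the browser back on the page, restore it.
    window.addEventListener("load", function () {
        var saved = sessionStorage.getItem("searchCriteria");
        if (saved !== null) {
            document.getElementById("searchCriteria").value = saved;
        }
    });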

    Read the article

  • populate drop-down values dynamically using Ajax

    - by abhishek
    Hi, I have 3 drop-downs. The 1st drop-down contains some values when the page loads. I need to populate the 2nd drop-down based on the value selected in the 1st drop-down. Similarly, I need to populate the 3rd drop-down based on the values selected in the 1st and 2nd drop-downs. Initially I tried it like this:

    <h:selectOneMenu value="#{stu.country}">
        <f:selectItems value="#{bean.allCountries}" />
        <a4j:support event="onchange" action="#{bean.retrieveStates(stu.country)}" reRender="states_dropDown"></a4j:support>
    </h:selectOneMenu>

    Similarly, for the 2nd drop-down:

    <h:selectOneMenu id="states_dropDown" value="#{stu.state}">
        <f:selectItems value="#{bean.allStates}" />
        <a4j:support event="onchange" action="#{bean.retrieveCities(stu.country,stu.state)}" reRender="City_dropDown"></a4j:support>
    </h:selectOneMenu>

    Sometimes this code works fine, but sometimes it doesn't invoke the managed bean method. Can you please help?

    Read the article

  • Hibernate JPA Caching Problem, Please help!

    - by Sameer Malhotra
    OK, here is my problem. I have a table named Master_Info_tbl. It's a lookup table. Here is the code for the entity mapped to it:

    @Entity
    @Table(name="MASTER_INFO_T")
    public class CodeValue implements java.io.Serializable {
        private static final long serialVersionUID = -3732397626260983394L;
        private Integer objectid;
        private String codetype;
        private String code;
        private String shortdesc;
        private String longdesc;
        private Integer dptid;
        private Integer sequen;
        private Timestamp begindate;
        private Timestamp enddate;
        private String username;
        private Timestamp rowlastchange;
        // getter/setter methods
    }

    I have a service layer which calls the method service.findbycodeType("Code1"). In the same way, this table is queried for the other code types as well, e.g. code2, code3, and so on up to code10, each of which gets its result set from the same table and is shown in drop-downs on the JSP pages. Since these drop-downs are in 90% of the pages, I am thinking of caching them globally. Any idea how to achieve this? FYI: I am using JPA and Hibernate with Struts2 and Spring. The database being used is DB2 UDB 8.2. Please help!

    Read the article
