Search Results

Search found 10099 results on 404 pages for 'yii db'.


  • SQL SERVER – Maximize Database Performance with DB Optimizer – SQL in Sixty Seconds #054

    - by Pinal Dave
    Performance tuning is an interesting concept, and everybody evaluates it differently. Every developer and DBA has a different opinion about how to do it. I personally believe performance tuning is a three-step process: understand the query, identify the bottleneck, implement the fix. When a large database application suddenly starts to slow down, we are all under stress to get the database back to normal speed. Most of the time we do not have enough time for a deep analysis of what is going wrong, or of what will fix the problem; our primary goal at that moment is simply to fix the problem as fast as we can. One very important thing to keep in mind, however, is that a quick fix should not create further issues in other parts of the system. When time is of the essence and we still want a deep analysis of the system, we often make mistakes, precisely because we do not have the time to analyze the entire system properly. Here is what I do when I face such a situation: I take the help of DB Optimizer. It is a fantastic tool and does superlative performance tuning of the system. Every time I talk about a performance tuning tool, people's initial reaction is that they do not want to try it, because they believe it requires a lot of learning before use. That is absolutely not true in the case of DB Optimizer: it is an easy-to-use, intuitive tool, and one can get going with the product in no time. Here is a quick video I have built in which I demonstrate how to identify which index is missing for a query and how to quickly create that index. All three steps of query tuning are completed in less than 60 seconds. If you are into performance tuning and query optimization, you should download DB Optimizer and give it a go. Let us see the same concept in the following SQL in Sixty Seconds video. You can download DB Optimizer and reproduce the same Sixty Seconds experience. Related Tips in SQL in Sixty Seconds: Performance Tuning – Part 1 of 2 – Getting Started and Configuration; Performance Tuning – Part 2 of 2 – Analysis, Detection, Tuning and Optimizing. What would you like to see in the next SQL in Sixty Seconds video? Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Database, Pinal Dave, PostADay, SQL, SQL Authority, SQL in Sixty Seconds, SQL Interview Questions and Answers, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology, Video Tagged: Identity
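
    As an aside for readers without the tool: a rough, hand-rolled version of the "identify the bottleneck" step can be had from SQL Server's missing-index DMVs. The sketch below only illustrates the idea; it is not how DB Optimizer works internally, and its ranking heuristic is deliberately crude.

      -- Minimal T-SQL sketch: surface missing-index suggestions by hand.
      SELECT TOP 10
          d.statement                      AS table_name,     -- table the suggestion targets
          d.equality_columns,                                 -- candidate leading key columns
          d.inequality_columns,
          d.included_columns,
          s.user_seeks * s.avg_user_impact AS rough_benefit   -- crude ranking only
      FROM sys.dm_db_missing_index_details d
      JOIN sys.dm_db_missing_index_groups g
          ON d.index_handle = g.index_handle
      JOIN sys.dm_db_missing_index_group_stats s
          ON g.index_group_handle = s.group_handle
      ORDER BY rough_benefit DESC;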

    Read the article

  • CPU DB like IMDB for Microprocessors

    - by Jason Fitzpatrick
    If you're interested in the history of microprocessors, the CPU DB at Stanford is a massive database of microprocessors that covers everything from code names to clock speeds to processor families. Play with their visualizations or download the entire database and make your own. CPU DB [Stanford.edu]

    Read the article

  • Magento, NGINX, PHP-FPM, APC, Memcached, 16 GB RAM, CentOS: PHP-FPM spiking to 100% CPU

    - by Terry Dunford
    I have been trying to resolve an issue of spiking CPU usage caused by php-fpm processes. I've reduced the php-fpm config settings to:
    pm = ondemand
    pm.max_children = 12
    pm.start_servers = 2
    pm.min_spare_servers = 2
    pm.max_spare_servers = 10
    pm.max_requests = 500
    php_admin_value[memory_limit] = 128M
    The problem still exists. I'm running a Joomla main site (which is having no problems) and a Magento store in a sub-directory. The server is CentOS Linux running NGINX, APC, Memcached, full page cache, and php-fpm, with 8 cores and 16 GB of dedicated RAM. My host has shut down my server several times in the past week because my php-fpm processes were consuming the entire network, and many of the individual php-fpm processes get over 50% CPU. I've hired several "professionals" and none of them were able to help, so, now broke and stumped, I'm turning to you for help. Any suggestions would be greatly appreciated. I turned on PHP slow logs, and here are some of the latest results:
    [01-Apr-2012 14:26:12] [pool magento] pid 21537 script_filename = /home/flyfish/www/flyshop/index.php [0x0000000011a394f8] _renderStraightjoin() /home/flyfish/www/flyshop/lib/Varien/Db/Select.php:397 [0x0000000011a39158] _renderStraightjoin() /home/flyfish/www/flyshop/lib/Zend/Db/Select.php:705 [0x0000000011a38f30] assemble() /home/flyfish/www/flyshop/lib/Zend/Db/Select.php:1343 [0x00007fffbb6d6e50] __toString() unknown:0 [0x0000000011a38630] _prepareQuery() /home/flyfish/www/flyshop/lib/Varien/Db/Adapter/Pdo/Mysql.php:409 [0x0000000011a38270] _prepareQuery() /home/flyfish/www/flyshop/lib/Varien/Db/Adapter/Pdo/Mysql.php:388 [0x0000000011a38008] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:734 [0x0000000011a375c8] fetchAll() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Resource/Product/Type/Configurable/Attribute/Collection.php:196 [0x0000000011a370e0] _loadLabels() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Resource/Product/Type/Configurable/Attribute/Collection.php:129 [0x0000000011a369a0] _afterLoad() /home/flyfish/www/flyshop/lib/Varien/Data/Collection/Db.php:536 [0x0000000011a364a8] load() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product/Type/Configurable.php:253 [0x0000000011a35968] getConfigurableAttributes() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product/Type/Configurable.php:330 [0x0000000011a35590] getUsedProducts() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product/Type/Configurable.php:458 [0x0000000011a35410] isSalable() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product.php:1264 [0x0000000011a35098] isAvailable() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product.php:1244 [0x0000000011a34fa8] isSalable() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product.php:1308 [0x0000000011a33998] isSaleable() /home/flyfish/www/flyshop/app/design/frontend/moxy/default/template/rokmagemodules/rokmage-categoryview/rokmage-categoryview.phtml:122 [0x0000000011a331f0] +++ dump failed
    [01-Apr-2012 14:26:44] [pool magento] pid 21531 script_filename = /home/flyfish/www/flyshop/index.php [0x0000000011a37768] _loadPrices() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Resource/Product/Type/Configurable/Attribute/Collection.php:251 [0x0000000011a37280] _loadPrices() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Resource/Product/Type/Configurable/Attribute/Collection.php:132 [0x0000000011a36b40] _afterLoad()
/home/flyfish/www/flyshop/lib/Varien/Data/Collection/Db.php:536 [0x0000000011a36648] load() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product/Type/Configurable.php:253 [0x0000000011a35b08] getConfigurableAttributes() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product/Type/Configurable.php:330 [0x0000000011a35730] getUsedProducts() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product/Type/Configurable.php:458 [0x0000000011a355b0] isSalable() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product.php:1264 [0x0000000011a35238] isAvailable() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product.php:1244 [0x0000000011a35148] isSalable() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product.php:1308 [0x0000000011a33b38] isSaleable() /home/flyfish/www/flyshop/app/design/frontend/moxy/default/template/rokmagemodules/rokmage-categoryview/rokmage-categoryview.phtml:122 [0x0000000011a33390] +++ dump failed [01-Apr-2012 14:27:01] [pool magento] pid 21528 script_filename = /home/flyfish/www/flyshop/index.php [0x0000000011ff67a8] execute() /home/flyfish/www/flyshop/lib/Zend/Db/Statement/Pdo.php:228 [0x0000000011ff6518] _execute() /home/flyfish/www/flyshop/lib/Varien/Db/Statement/Pdo/Mysql.php:110 [0x0000000011ff5e90] _execute() /home/flyfish/www/flyshop/lib/Zend/Db/Statement.php:300 [0x0000000011ff5a20] execute() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:479 [0x0000000011ff5438] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Pdo/Abstract.php:238 [0x0000000011ff5078] query() /home/flyfish/www/flyshop/lib/Varien/Db/Adapter/Pdo/Mysql.php:389 [0x0000000011ff4e98] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:825 [0x0000000011ff4948] fetchOne() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Resource/Category/Flat.php:1161 [0x0000000011ff4678] getProductCount() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Category.php:801 [0x0000000011ff33e0] getProductCount() /home/flyfish/www/flyshop/app/code/local/Extendware/EWLayeredNav/Model/Library/Plugin/Catalog/Layer/Filter/Category.php:54 [0x0000000011ff2da0] _initItemsData() /home/flyfish/www/flyshop/app/code/local/Extendware/EWLayeredNav/Model/Library/Plugin/Catalog/Layer/Filter/Category.php:23 [0x0000000011ff2818] _getItemsData() /home/flyfish/www/flyshop/app/code/local/Extendware/EWLayeredNav/Model/Library/Plugin/Catalog/Layer/Filter/Category.php:119 [0x0000000011ff26b0] _initItems() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Layer/Filter/Abstract.php:120 [0x0000000011ff2598] getItems() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Layer/Filter/Abstract.php:109 [0x0000000011ff2480] getItemsCount() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Block/Layer/Filter/Abstract.php:126 [0x0000000011ff22b8] getItemsCount() /home/flyfish/www/flyshop/var/cache/extendware/ewcore/overrides/Mage/Catalog/Block/Layer/View/67dcc5dfa9c44bd3a205b75a08193105.php:218 [0x0000000011ff2088] canShowOptions() /home/flyfish/www/flyshop/var/cache/extendware/ewcore/overrides/Mage/Catalog/Block/Layer/View/67dcc5dfa9c44bd3a205b75a08193105.php:233 [0x0000000011ff14f8] canShowBlock() /home/flyfish/www/flyshop/app/design/frontend/moxy/default/template/extendware/ewlayerednav/catalog/layer/view.phtml:6 [0x0000000011ff0d50] +++ dump failed [01-Apr-2012 14:27:04] [pool magento] pid 21529 script_filename = /home/flyfish/www/flyshop/index.php [0x0000000012468ff8] execute() 
/home/flyfish/www/flyshop/lib/Zend/Db/Statement/Pdo.php:228 [0x0000000012468d68] _execute() /home/flyfish/www/flyshop/lib/Varien/Db/Statement/Pdo/Mysql.php:110 [0x00000000124686e0] _execute() /home/flyfish/www/flyshop/lib/Zend/Db/Statement.php:300 [0x0000000012468270] execute() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:479 [0x0000000012467c88] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Pdo/Abstract.php:238 [0x00000000124678c8] query() /home/flyfish/www/flyshop/lib/Varien/Db/Adapter/Pdo/Mysql.php:389 [0x0000000012467660] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:734 [0x0000000012467248] fetchAll() /home/flyfish/www/flyshop/lib/Varien/Data/Collection/Db.php:687 [0x00000000124668f0] _fetchAll() /home/flyfish/www/flyshop/app/code/core/Mage/Eav/Model/Entity/Collection/Abstract.php:1045 [0x0000000012466288] _loadEntities() /home/flyfish/www/flyshop/app/code/core/Mage/Eav/Model/Entity/Collection/Abstract.php:869 [0x0000000012465fb0] load() /home/flyfish/www/flyshop/app/code/core/Mage/Review/Model/Observer.php:78 [0x0000000012465d10] catalogBlockProductCollectionBeforeToHtml() /home/flyfish/www/flyshop/app/code/core/Mage/Core/Model/App.php:1303 [0x0000000012464c28] _callObserverMethod() /home/flyfish/www/flyshop/app/code/core/Mage/Core/Model/App.php:1278 [0x00000000124649e0] dispatchEvent() /home/flyfish/www/flyshop/app/Mage.php:416 [0x0000000012464290] dispatchEvent() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Block/Product/List.php:163 [0x0000000012463760] _beforeToHtml() /home/flyfish/www/flyshop/var/ait_rewrite/6bfe16ca572eea47db567910902c6209.php:864 [0x00000000124633b0] toHtml() /home/flyfish/www/flyshop/var/ait_rewrite/6bfe16ca572eea47db567910902c6209.php:584 [0x0000000012462e30] _getChildHtml() /home/flyfish/www/flyshop/var/ait_rewrite/6bfe16ca572eea47db567910902c6209.php:528 [0x0000000012462d38] getChildHtml() /home/flyfish/www/flyshop/var/cache/extendware/ewcore/overrides/Mage/Catalog/Block/Category/View/6362e7526f5dcb27e7f8b0b414b59004.php:85 [0x00000000124629f0] getProductListHtml() /home/flyfish/www/flyshop/app/code/local/Extendware/EWLayeredNav/Block/Override/Mage/Catalog/Category/View.php:20 [01-Apr-2012 14:27:55] [pool magento] pid 21536 script_filename = /home/flyfish/www/flyshop/index.php [0x0000000011a35010] execute() /home/flyfish/www/flyshop/lib/Zend/Db/Statement/Pdo.php:228 [0x0000000011a34d80] _execute() /home/flyfish/www/flyshop/lib/Varien/Db/Statement/Pdo/Mysql.php:110 [0x0000000011a346f8] _execute() /home/flyfish/www/flyshop/lib/Zend/Db/Statement.php:300 [0x0000000011a34288] execute() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:479 [0x0000000011a33ca0] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Pdo/Abstract.php:238 [0x0000000011a338e0] query() /home/flyfish/www/flyshop/lib/Varien/Db/Adapter/Pdo/Mysql.php:389 [0x0000000011a33700] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:825 [0x0000000011a33368] fetchOne() /home/flyfish/www/flyshop/app/code/core/Mage/Eav/Model/Resource/Entity/Type.php:71 [0x0000000011a33238] getAdditionalAttributeTable() /home/flyfish/www/flyshop/app/code/core/Mage/Eav/Model/Resource/Entity/Attribute.php:483 [0x0000000011a32be8] getAdditionalAttributeTable() /home/flyfish/www/flyshop/app/code/core/Mage/Eav/Model/Resource/Entity/Attribute.php:500 [0x0000000011a32860] _afterLoad() /home/flyfish/www/flyshop/app/code/core/Mage/Eav/Model/Resource/Entity/Attribute.php:108 [0x0000000011a32330] loadByCode() 
/home/flyfish/www/flyshop/app/code/core/Mage/Eav/Model/Entity/Attribute/Abstract.php:118 [0x0000000011a31350] loadByCode() /home/flyfish/www/flyshop/app/code/core/Mage/Eav/Model/Config.php:423 [0x0000000011a30ce8] getAttribute() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Helper/Output.php:156 [0x0000000011a30208] categoryAttribute() /home/flyfish/www/flyshop/app/design/frontend/base/default/template/catalog/category/view.phtml:47 [0x0000000011a2fa60] +++ dump failed [01-Apr-2012 14:27:56] [pool magento] pid 21530 script_filename = /home/flyfish/www/flyshop/index.php [0x0000000011a35b10] updateParamDefaults() /home/flyfish/www/flyshop/var/ait_rewrite/78778b0d1ad4bf93e846365bd2fbf33f.php:276 [0x0000000011a35750] updateParamDefaults() /home/flyfish/www/flyshop/var/ait_rewrite/78778b0d1ad4bf93e846365bd2fbf33f.php:326 [0x0000000011a351f0] getSkinBaseUrl() /home/flyfish/www/flyshop/var/ait_rewrite/78778b0d1ad4bf93e846365bd2fbf33f.php:482 [0x0000000011a350a8] getSkinUrl() /home/flyfish/www/flyshop/var/ait_rewrite/6bfe16ca572eea47db567910902c6209.php:981 [0x0000000011a32468] getSkinUrl() /home/flyfish/www/flyshop/app/code/local/Extendware/EWMinify/Block/Override/Mage/Page/Html/Head.php:126 [0x0000000011a30ca8] getCssJsHtml() /home/flyfish/www/flyshop/app/code/local/Extendware/EWCore/Block/Override/Mage/Page/Html/Head.php:55 [0x0000000011a30978] getCssJsHtml() /home/flyfish/www/flyshop/app/code/local/MageWorx/SeoSuite/Block/Page/Html/Head.php:41 [0x0000000011a2fd10] getCssJsHtml() /home/flyfish/www/flyshop/app/design/frontend/moxy/default/template/rokmagemodules/rokmage-modalheader/rokmage-head.phtml:26 [0x0000000011a2f568] +++ dump failed [01-Apr-2012 14:28:28] [pool magento] pid 21527 script_filename = /home/flyfish/www/flyshop/index.php [0x0000000010c7bba0] execute() /home/flyfish/www/flyshop/lib/Zend/Db/Statement/Pdo.php:228 [0x0000000010c7b910] _execute() /home/flyfish/www/flyshop/lib/Varien/Db/Statement/Pdo/Mysql.php:110 [0x0000000010c7b288] _execute() /home/flyfish/www/flyshop/lib/Zend/Db/Statement.php:300 [0x0000000010c7ae18] execute() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:479 [0x0000000010c7a830] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Pdo/Abstract.php:238 [0x0000000010c7a470] query() /home/flyfish/www/flyshop/lib/Varien/Db/Adapter/Pdo/Mysql.php:389 [0x0000000010c7a168] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:808 [0x0000000010c79558] fetchPairs() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Resource/Product/Collection.php:840 [0x0000000010c79240] addCountToCategories() /home/flyfish/www/flyshop/app/code/community/Mage/Catalog/Block/Navigation.php:133 [0x0000000010c71d48] getCurrentChildCategories() /home/flyfish/www/flyshop/app/design/frontend/base/default/template/rokmagemodules/rokmage-magemenus/rokmage-magemenu-left.phtml:139 [0x0000000010c715a0] +++ dump failed [01-Apr-2012 14:28:28] [pool magento] pid 21577 script_filename = /home/flyfish/www/flyshop/index.php [0x0000000011a3a8d8] execute() /home/flyfish/www/flyshop/lib/Zend/Db/Statement/Pdo.php:228 [0x0000000011a3a648] _execute() /home/flyfish/www/flyshop/lib/Varien/Db/Statement/Pdo/Mysql.php:110 [0x0000000011a39fc0] _execute() /home/flyfish/www/flyshop/lib/Zend/Db/Statement.php:300 [0x0000000011a39b50] execute() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:479 [0x0000000011a39568] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Pdo/Abstract.php:238 [0x0000000011a391a8] query() 
/home/flyfish/www/flyshop/lib/Varien/Db/Adapter/Pdo/Mysql.php:389 [0x0000000011a38f40] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:734 [0x0000000011a37cc0] fetchAll() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Resource/Category/Flat.php:276 [0x0000000011a37b20] _loadNodes() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Resource/Category/Flat.php:1229 [0x0000000011a379a0] getChildrenCategories() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Category.php:841 [0x0000000011a37690] getChildrenCategories() /home/flyfish/www/flyshop/app/code/community/Mage/Catalog/Block/Navigation.php:130 [0x0000000011a30198] getCurrentChildCategories() /home/flyfish/www/flyshop/app/design/frontend/base/default/template/rokmagemodules/rokmage-magemenus/rokmage-magemenu-left.phtml:139 [0x0000000011a2f9f0] +++ dump failed [01-Apr-2012 14:28:48] [pool magento] pid 21629 script_filename = /home/flyfish/www/flyshop/index.php [0x00002ac987e2cb48] _loadPrices() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Resource/Product/Type/Configurable/Attribute/Collection.php:252 [0x00002ac987e2c660] _loadPrices() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Resource/Product/Type/Configurable/Attribute/Collection.php:132 [0x00002ac987e2bf20] _afterLoad() /home/flyfish/www/flyshop/lib/Varien/Data/Collection/Db.php:536 [0x00002ac987e2ba28] load() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product/Type/Configurable.php:253 [0x00002ac987e2aee8] getConfigurableAttributes() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product/Type/Configurable.php:330 [0x00002ac987e2ab10] getUsedProducts() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product/Type/Configurable.php:458 [0x00002ac987e2a990] isSalable() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product.php:1264 [0x00002ac987e2a618] isAvailable() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product.php:1244 [0x00002ac987e2a528] isSalable() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product.php:1308 [0x00002ac987e28f18] isSaleable() /home/flyfish/www/flyshop/app/design/frontend/moxy/default/template/rokmagemodules/rokmage-categoryview/rokmage-categoryview.phtml:122 [0x00002ac987e28770] +++ dump failed ___________________________________________ A snippet of the Latest php-fpm error log: [01-Apr-2012 14:26:12] WARNING: [pool magento] child 21537, script '/home/flyfish/www/flyshop/index.php' (request: "GET /flyshop/index.php") executing too slow (5.265105 sec), logging [01-Apr-2012 14:26:12] ERROR: failed to ptrace(PEEKDATA) pid 21537: Input/output error (5) [01-Apr-2012 14:26:44] WARNING: [pool magento] child 21531, script '/home/flyfish/www/flyshop/index.php' (request: "GET /flyshop/index.php") executing too slow (5.268434 sec), logging [01-Apr-2012 14:26:44] ERROR: failed to ptrace(PEEKDATA) pid 21531: Input/output error (5) [01-Apr-2012 14:27:01] WARNING: [pool magento] child 21528, script '/home/flyfish/www/flyshop/index.php' (request: "GET /flyshop/index.php") executing too slow (6.656633 sec), logging [01-Apr-2012 14:27:01] ERROR: failed to ptrace(PEEKDATA) pid 21528: Input/output error (5) [01-Apr-2012 14:27:04] WARNING: [pool magento] child 21529, script '/home/flyfish/www/flyshop/index.php' (request: "GET /flyshop/index.php") executing too slow (5.211136 sec), logging [01-Apr-2012 14:27:55] WARNING: [pool magento] child 21536, script '/home/flyfish/www/flyshop/index.php' (request: "GET 
/flyshop/index.php") executing too slow (5.207001 sec), logging [01-Apr-2012 14:27:55] ERROR: failed to ptrace(PEEKDATA) pid 21536: Input/output error (5) [01-Apr-2012 14:27:56] WARNING: [pool magento] child 21530, script '/home/flyfish/www/flyshop/index.php' (request: "GET /flyshop/index.php") executing too slow (5.503186 sec), logging [01-Apr-2012 14:27:56] ERROR: failed to ptrace(PEEKDATA) pid 21530: Input/output error (5) [01-Apr-2012 14:28:28] WARNING: [pool magento] child 21577, script '/home/flyfish/www/flyshop/index.php' (request: "GET /flyshop/index.php") executing too slow (5.722625 sec), logging [01-Apr-2012 14:28:28] WARNING: [pool magento] child 21527, script '/home/flyfish/www/flyshop/index.php' (request: "GET /flyshop/index.php") executing too slow (5.122326 sec), logging [01-Apr-2012 14:28:28] ERROR: failed to ptrace(PEEKDATA) pid 21527: Input/output error (5) [01-Apr-2012 14:28:28] ERROR: failed to ptrace(PEEKDATA) pid 21577: Input/output error (5) [01-Apr-2012 14:28:48] WARNING: [pool magento] child 21629, script '/home/flyfish/www/flyshop/index.php' (request: "GET /flyshop/index.php") executing too slow (5.446961 sec), logging [01-Apr-2012 14:28:48] ERROR: failed to ptrace(PEEKDATA) pid 21629: Input/output error (5) _____________________________________________ I also noticed that the server is not using much memory: Mem: 16777216k total, 1204040k used, 15573176k free My.conf settings: query_cache_size = 128M innodb_buffer_pool_size = 512M open-files-limit = 8192 table_cache=4096 I just noticed that someone changed my innodb_buffer_pool_size to 512M. Shouldn't this be set to 80% of available ram? So I have 16gb ram so it should be set at 12G; however, I set it at 10G. What do you think? I made that change and restart everything. Php-fpm is still spiking cpu. Here is just 1 php-fpm process: 23942 user 17 0 507m 99m 27m R 90.9%CPU 0.6 0:03.46 php-fpm I'm sure there may be more information you will need to help, so just let me know what you guys need to help me figure this out. Thank you.

    Read the article

  • Copy a DB diagram from one DB to another on different servers? (Same DB)

    - by sah302
    I used the Copy Database wizard to copy my database from our test server to our production server. It copied everything fine except for the diagrams. Okay, no problem: first I made sure the target database on production had the support objects created for database diagramming. Then I selected Import Data from the other database and chose dbo.sysdiagrams. I went through the rest of the Import Data wizard, but then I got the following error: Validating (Error) Messages Error 0xc0202049: Data Flow Task: Failure inserting into the read-only column "diagram_id". (SQL Server Import and Export Wizard) Error 0xc0202045: Data Flow Task: Column metadata validation failed. (SQL Server Import and Export Wizard) Error 0xc004706b: Data Flow Task: "component "Destination - sysdiagrams" (31)" failed validation and returned validation status "VS_ISBROKEN". (SQL Server Import and Export Wizard) Error 0xc004700c: Data Flow Task: One or more component failed validation. (SQL Server Import and Export Wizard) Error 0xc0024107: Data Flow Task: There were errors during task validation. (SQL Server Import and Export Wizard) So apparently it didn't like that. What's the problem? I am a pretty beginner-level SQL Server user and usually only work through the GUI, so I am not sure what to do at this point. The databases are the same, but on different servers. Thanks!
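
    For what it's worth, the usual workaround is to bypass the wizard, since it refuses to insert into the identity column diagram_id. A hedged T-SQL sketch, run on the production server; [TESTSERVER] is a hypothetical linked-server name for the source box and SourceDb an illustrative database name:

      -- Copy diagrams by hand, inserting the identity values explicitly
      SET IDENTITY_INSERT dbo.sysdiagrams ON;

      INSERT INTO dbo.sysdiagrams (name, principal_id, diagram_id, version, definition)
      SELECT name, principal_id, diagram_id, version, definition
      FROM [TESTSERVER].SourceDb.dbo.sysdiagrams;

      SET IDENTITY_INSERT dbo.sysdiagrams OFF;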

    Read the article

  • SQL DB design to support user feeds (in an application like Facebook)

    - by Yoav
    I have a social network server with a MySQL DB. I want to show users feeds, as done on Facebook; for example: userX is now friends with userY, userX liked postX, and so on. Currently I have a table: C1: UserId; C2: LogType (now-friend, did-like, etc.); C3: ObjectId (can be a userId or a postId, depending on the LogType). Currently, to get all the logs to show to a user, I do the following: 1. Get all the user's friends' userIds. 2. Query all rows whose C1 is in those userIds. 3. Scan the results and, for example, if LogType equals DidLike, check whether the post's OwnerId is the userId; if yes, add it to the logs. And so on. Obviously this is not efficient at all, and I am looking for a better way. Here is what I had in mind: create a new table (in addition to the Log table) with C1: UserId; C2: LogId (from the Log table); C3: UserId of the one who did the action. When querying logs, look in this table and fetch the related logs (by LogId) from the Log table. Updating the table: whenever a user performs an action that should be logged: 1. Add the log entry to the Log table. 2. Determine which users are interested in the log (who the user's friends are, who owns the post) and add the related entries to the new table (must be done in the background). 3. If a user unfriends another user, look in the new table for all rows where C3 equals the unfriended user's id and delete them. Any opinions? Other suggestions?
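
    To make the proposal concrete, here is a hedged MySQL sketch of the two-table fan-out design described above; all table and column names are illustrative, not taken from the original schema:

      -- One row per action
      CREATE TABLE activity_log (
          log_id     BIGINT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
          actor_id   INT UNSIGNED NOT NULL,     -- user who performed the action
          log_type   TINYINT NOT NULL,          -- friend = 1, like = 2, ...
          object_id  INT UNSIGNED NOT NULL,     -- userId or postId per log_type
          created_at DATETIME NOT NULL
      );

      -- One row per (recipient, action): the pre-computed feed
      CREATE TABLE feed_entry (
          recipient_id INT UNSIGNED NOT NULL,   -- user whose feed shows the log
          log_id       BIGINT UNSIGNED NOT NULL,
          actor_id     INT UNSIGNED NOT NULL,   -- denormalized for cheap unfriend cleanup
          PRIMARY KEY (recipient_id, log_id),
          KEY idx_unfriend (recipient_id, actor_id)
      );

      -- Reading a feed page becomes a single indexed query:
      -- SELECT l.* FROM feed_entry f JOIN activity_log l USING (log_id)
      --  WHERE f.recipient_id = ? ORDER BY l.log_id DESC LIMIT 50;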

    Read the article

  • DB Enterprise User Security Integration With Directory Services

    - by Etienne Remillon
    Gain a better understanding of how to integrate Enterprise User Security (EUS) with various directories by attending this one-hour Advisor Webcast! When: July 11, 2012 at 16:00 UK / 17:00 CET / 08:00 am Pacific / 9:00 am Mountain / 11:00 am Eastern. Enterprise User Security (EUS) is a DB feature to externalize and centrally manage DB users in a directory server. The webcast will briefly introduce EUS, followed by a detailed discussion of the various directory options that are supported, including integration with Microsoft Active Directory. We'll conclude with how to avoid common pitfalls when deploying EUS with directory services. TOPICS WILL INCLUDE: - Understand EUS basics - Understand EUS and directory integration options - Avoid common EUS deployment mistakes. Make sure to register and mark this date on your calendar! - Details and registration.

    Read the article

  • Using a back-end mechanism to copy files to DB and notify the application

    - by BDotA
    The scenario: a user copies large files into a local folder. I want to watch that folder, and when a new file is dropped, copy it into the database, so that when the copying is done I can actually use it in my application (a C# WinForms app). It would also be great to find a way to get notified in the application that copying the file to the DB has finished and it is ready for use. I am using C#.NET on Windows. What solution/architecture do you suggest for this? For example, a Windows service running all the time that watches the folder and writes anything copied into it to the DB; but then how would the application get notified? Is MSMQ something I can use? I don't know much about it yet. Thanks.
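
    A minimal C# sketch of the folder-watching half, using the standard FileSystemWatcher class; SaveFileToDatabase is a hypothetical helper, the folder path is illustrative, and the notification step is only indicated in comments:

      using System;
      using System.IO;

      class FolderWatcher
      {
          static void Main()
          {
              using var watcher = new FileSystemWatcher(@"C:\DropFolder");
              watcher.Created += (s, e) =>
              {
                  // Files can still be mid-copy when Created fires; real code
                  // should retry until the file opens without a sharing violation.
                  Console.WriteLine($"New file: {e.FullPath}");
                  // SaveFileToDatabase(e.FullPath);  // hypothetical DB upload
                  // After a successful save, signal the WinForms app, e.g. via
                  // an MSMQ message or a "processed" flag the app polls.
              };
              watcher.EnableRaisingEvents = true;
              Console.ReadLine();  // keep the process alive while watching
          }
      }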

    Read the article

  • Best way to save infinite playlists (arrays) into a DB? (PHP/MySQL)

    - by Ole Jak
    A client gives me a string like "1,23,23,abc,ggg,544,tf4," from user 12. There can be an infinite number of elements, with no spaces: just a value,value,... structure. I have a users table (with each user's uId (key), name, etc.) and a streams table (with sId (key), externalId, and other values). The user sends me externalIds, and I need the playlist to hold externalIds (not my sIds). I need some way to store such an array in my DB and to get it back out. I need to be able to do two things: return such a string back to the user, and get an array from it, like {1; 23; 23; abc; ggg; 544; tf4}. So what is the best method (best here meaning the smallest amount of code) to store such data in the DB and retrieve the stored data in both of the ways shown?
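
    Given the "smallest amount of code" criterion, the simplest approach is arguably to store the string verbatim and split it on demand. A hedged PHP/PDO sketch; the playlists table and column names are illustrative:

      // CREATE TABLE playlists (userId INT PRIMARY KEY, items TEXT NOT NULL);

      function savePlaylist(PDO $db, int $userId, string $items): void {
          $stmt = $db->prepare('REPLACE INTO playlists (userId, items) VALUES (?, ?)');
          $stmt->execute([$userId, rtrim($items, ',')]); // drop the trailing comma
      }

      function loadPlaylist(PDO $db, int $userId): array {
          $stmt = $db->prepare('SELECT items FROM playlists WHERE userId = ?');
          $stmt->execute([$userId]);
          $raw = $stmt->fetchColumn();
          return $raw === false ? [] : explode(',', $raw); // back to an array
      }

      // Returning the string to the user is just the stored value, or
      // implode(',', $array) if you start from the array form.

    The trade-off: if you ever need to query individual playlist items in SQL, a normalized child table (one row per item) is the better design, at the cost of more code.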

    Read the article

  • DB object passing between classes: singleton, static, or other?

    - by Stephen
    So I'm designing a reporting system at work. It's my first project written in OOP, and I'm stuck on the design choice for the DB class. Obviously I only want to create one instance of the DB class per session/user and then pass it to each of the classes that need it. What I don't know is what's best practice for implementing this. Currently I have code like the following: class db { private $user = 'USER'; private $pass = 'PASS'; private $tables = array( 'user', 'report', 'etc...'); function __construct(){ // SET UP CONNECTION AND TABLES } }; class report { function __construct ($params = array(), $db, $user) { // Error checking/handling trimmed // $db is the database object we created $this->db = $db; // $this->user is the user object for the logged-in user $this->user = $user; $this->reportCreate(); } public function setPermission($permissionId = 1) { // Note the $this->db: is this the best-practice solution? $this->db->permission->find($permissionId); // Note the $this->user: is this the best-practice solution? $this->user->checkPermission(1); $data = array(); $this->db->reportpermission->insert($data); } }; // end report I've been reading about static classes and have just come across singletons (though these appear to be passé already?), so what's the current best practice for doing this?
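
    For comparison, a hedged sketch of plain constructor injection, which is what most people recommend over singletons or static classes these days; the user class and its constructor here are hypothetical:

      // Build one db instance at the entry point and hand it to whatever needs it
      $db     = new db();                              // one connection per request
      $user   = new user($db, $userId);                // hypothetical user class
      $report = new report(array('name' => 'Monthly'), $db, $user);

      // A singleton variant, if you really want global access (harder to test):
      class DbSingleton extends db {
          private static $instance = null;
          public static function get() {
              if (self::$instance === null) {
                  self::$instance = new self();        // lazily create the one instance
              }
              return self::$instance;
          }
      }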

    Read the article

  • [Japanese-language post about Oracle DB performance tuning; title lost to an encoding error]

    - by Yusuke.Yamamoto
    [Most of this Japanese-language post did not survive encoding. Recoverable references: a Togetter thread on Oracle DB performance tuning; two companion posts, Part 1 and Part 2, on the Keep It Simple, Stupid blog; the Twitter account @yoshikaw; mentions of Oracle ACE and ORACLE MASTER Platinum credentials; and the oracletech.jp site.]

    Read the article

  • Highlights from recent Yammer video

    - by Eric Jensen
    A few weeks back, Ryan Kennedy of Yammer gave a talk about Berkeley DB Java Edition. You can find it posted here on Alex Popescu's blog, or go directly to the video post itself. It is full of useful nuggets of information, such as why they chose to use BDB JE, performance, and some tips and tricks at the end. At over 40 minutes, the video is quite long. Ryan is an entertaining speaker, so I suggest you watch all of it. But if you only have time for the highlights, here are some times you can sync to: at 06:18, the Berkeley DB JE features that led Yammer to select it, including replication, automatic leader election and failover, and configurable durability and consistency guarantees; at 23:10, system performance characteristics; at 35:08, tips and tricks for using Berkeley DB JE. I know the Berkeley DB development team is very pleased that BDB JE is working out well for Yammer. We definitely encourage others out there to take note of this success, especially if your requirements are similar to Yammer's (which Ryan outlines at the beginning of his talk).

    Read the article

  • How to use filegroups for DB split?

    - by Robin Jain
    In my project I have one DB used for everything. I want to break it into two databases: static lookup tables in one DB, and tables with dynamic data in the other. My problem is how to use foreign key constraints between those two DBs. Can someone help me out and suggest a way to proceed? Even better if I can get an example. I thought of using synonyms for the tables and then putting constraints on the synonyms, but later I learned that synonyms cannot be used in constraints. I need to maintain the relationships among the tables from both DBs. The issue is with updates: with a new release I just want to update the lookup tables, which is why I want to split the DB. I would also like to know how filegroups could be used for this.
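
    Two hedged sketches of the options in play; all database, table, and column names below are illustrative. Cross-database FOREIGN KEY constraints are not supported in SQL Server, so a genuine two-database split usually means enforcing the link with a trigger, while the filegroup route keeps one database (so real FKs still work) and only separates the storage:

      -- Option 1: two databases, trigger-enforced "foreign key"
      USE DynamicDb;
      GO
      CREATE TRIGGER dbo.trg_Orders_CheckStatus
      ON dbo.Orders
      AFTER INSERT, UPDATE
      AS
      BEGIN
          IF EXISTS (SELECT 1 FROM inserted i
                     WHERE NOT EXISTS (SELECT 1 FROM LookupDb.dbo.OrderStatus s
                                       WHERE s.StatusId = i.StatusId))
          BEGIN
              RAISERROR('Unknown StatusId', 16, 1);
              ROLLBACK TRANSACTION;
          END
      END
      GO

      -- Option 2: one database, lookup tables isolated on their own filegroup
      ALTER DATABASE AppDb ADD FILEGROUP LookupFG;
      ALTER DATABASE AppDb ADD FILE
          (NAME = AppDb_Lookup, FILENAME = 'C:\Data\AppDb_Lookup.ndf')
          TO FILEGROUP LookupFG;
      -- CREATE TABLE dbo.OrderStatus (StatusId INT PRIMARY KEY, ...) ON LookupFG;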

    Read the article

  • Database Web Service using Toplink DB Provider

    - by Vishal Jain
    With JDeveloper 11gR2 you can now create database-based web services using the JAX-WS Provider. The key differences between this and the existing PL/SQL web services support are that it is based on the JAX-WS Provider, supports SQL queries for creating web services, and supports table CRUD operations. This is present as a new option in the New Gallery under 'Web Services'. When you invoke the New Gallery option, it presents you with three options to choose from. In this entry I will explain the options of creating a service based on SQL queries and on table CRUD operations. SQL query based service: when you select this option, the 'Next' page asks you for the DB connection details. You can also choose whether you want the SOAP 1.1 or 1.2 format; for this example, I will proceed with SOAP 1.1, the default option. On the next page, you can give the SQL query. The wizard supports bind variables, so you can parametrize your queries: give "?" as an input parameter you want to supply at runtime, and the "Bind Variables" button will be enabled. There you can specify the name and type of the variable. Finish the wizard. Now you can test your service in the Analyzer; see that the bind variable specified appears as an input parameter in the Analyzer input form. CRUD operations: for this, at step 2 of the wizard, select the radio button "Generate Table CRUD Service Provider". At the next step, select the DB connection and the table for which you want to generate the default set of operations. Finish the wizard. Now, run the service in the Analyzer for a quick check; see that all the basic operations are exposed.

    Read the article

  • Oracle DB, Oracle ADF, GlassFish, JDeveloper, NetBeans IDE

    - by Geertjan
    Today I started some experiments with Oracle guru Steven Davelaar, who lives about 20 minutes away from my place in Amsterdam by underground. Very convenient. He showed me a bunch of things in JDeveloper, while I showed him a bunch of things in NetBeans IDE. He managed to deploy an ADF application to GlassFish in JDeveloper. So far, I have failed to do the same thing in NetBeans IDE. Quite a few JARs (around 100) are needed, aside from the question of correctly setting up or importing an ADF application, and we're still figuring out which and who and when and where. And how. And if. And why. Nonetheless, I did manage to get Oracle DB set up in NetBeans IDE, after downloading it from here: http://www.oracle.com/technetwork/products/express-edition/downloads/index.html Here's what it looks like when registered in NetBeans IDE; notice that I have a cool sample database available. Data from the above database I managed to display very easily via the various NetBeans code generators in a PrimeFaces application, exactly as has been done many times in demonstrations and tutorials everywhere: generate JPA entities, then create an EJB, then inject the EJB into a PrimeFaces data table. The next step is to somehow do the same with ADF in NetBeans IDE. I had some trouble with passwords for Oracle DB; the command line (with Steven's help) proved helpful. Wish us luck as we continue our ADF-inspired journey. This blog entry by Shay is also relevant: Deploying Oracle ADF Essentials Applications to Glassfish

    Read the article

  • Call DB Stored Procedure using @NamedStoredProcedureQuery Injection

    - by anwilson
    An Oracle Database stored procedure can be called from the EJB business layer to perform complex DB-specific operations. This approach avoids the overhead of frequent network round-trips, which could impact end-user response times. A DB stored procedure can be invoked from EJB session bean business logic using the org.eclipse.persistence.queries.StoredProcedureCall API, but that approach requires more code to handle the session and arguments of the stored procedure, increasing the maintenance effort. EJB 3.0 applications using EclipseLink can instead use the @NamedStoredProcedureQuery annotation to call a database stored procedure as a named query. This blog will take you through the steps to call an Oracle Database stored procedure using @NamedStoredProcedureQuery. The EMP_SAL_INCREMENT procedure available in the HR schema will be used in this sample. Create an entity from the EMPLOYEES table. Add @NamedStoredProcedureQuery above @NamedQueries in Employees.java with the definition given below: @NamedStoredProcedureQuery(name="Employees.increaseEmpSal", procedureName = "EMP_SAL_INCREMENT", resultClass=void.class, resultSetMapping = "", returnsResultSet = false, parameters = { @StoredProcedureParameter(name = "EMP_ID", queryParameter = "EMPID"), @StoredProcedureParameter(name = "SAL_INCR", queryParameter = "SALINCR")} ) Observe how the stored procedure's arguments are handled easily in @NamedStoredProcedureQuery using @StoredProcedureParameter. Expose the entity bean by creating a session facade. A business method needs to be added to the session bean to access the stored procedure exposed as a named query: public void salaryRaise(Long empId, Long salIncrease) throws Exception { try{ Query query = em.createNamedQuery("Employees.increaseEmpSal"); query.setParameter("EMPID", empId); query.setParameter("SALINCR", salIncrease); query.executeUpdate(); } catch(Exception ex){ throw ex; } } Expose the business method through the session bean's remote interface: void salaryRaise(Long empId, Long salIncrease) throws Exception; A session bean client is required to invoke the method exposed through the remote interface. Call the exposed method in the session bean client's main method: final Context context = getInitialContext(); SessionEJB sessionEJB = (SessionEJB)context.lookup("Your-JNDI-lookup"); sessionEJB.salaryRaise(new Long(200), new Long(1000)); Deploy the session bean and run the client. The salary of the employee with Id 200 will be increased by 1000.
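
    The post never shows the body of EMP_SAL_INCREMENT. A minimal PL/SQL version consistent with the EMP_ID and SAL_INCR parameters used by the annotation might look like this; it is an assumption, not the actual procedure:

      CREATE OR REPLACE PROCEDURE EMP_SAL_INCREMENT (
          EMP_ID   IN EMPLOYEES.EMPLOYEE_ID%TYPE,
          SAL_INCR IN NUMBER
      ) AS
      BEGIN
          UPDATE EMPLOYEES
             SET SALARY = SALARY + SAL_INCR
           WHERE EMPLOYEE_ID = EMP_ID;
          -- no COMMIT here: the container-managed EJB transaction commits
      END;
      /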

    Read the article

  • rake db:create not working for legacy rails app (2.3.5) using MySQL (5.5.28)

    - by ridicter
    I'm a new Rails developer, and I'm working on a legacy Rails app. Whenever I run the rake db:create command, I get an error that the database couldn't be created. I have found many Stack Overflow questions related to this, but after trying nearly all permutations of the solutions, I couldn't resolve the issue. I created the three DBs (dev, prod, test), created the user with all access privileges on these DBs, and ran rake db:create. I'm running Mac OS X Lion, MySQL 5.5.28, Rails 2.3.5, Ruby 1.8.7. Here are my settings:
    development: adapter: mysql encoding: utf8 database: adva_development username: adva password: **** host: localhost socket: /tmp/mysql.sock
    Here's the error: Couldn't create database for {"adapter"=>"mysql", "username"=>"adva", "host"=>"localhost", "encoding"=>"utf8", "database"=>"adva_development", "socket"=>"/tmp/mysql.sock", "password"=>"****"}, charset: utf8, collation: utf8_unicode_ci (if you set the charset manually, make sure you have a matching collation) I have done the following troubleshooting: Verified the user and password are correct and the user has access to the DB (double-checked with SELECT * FROM mysql.db WHERE Db = 'adva_development' \G; the user has all privileges). Verified the socket is correct; I don't really understand sockets, but I can plainly see it at /tmp/mysql.sock. Checked collation and character set: I found out I had created the DBs with a latin charset and collation, so I recreated them. I ran show variables like "collation_database"; and show variables like "character_set_database"; and they came back with utf8_unicode_ci and utf8 respectively. I followed the instructions in this question: after uninstalling the mysql gem, I ran the following but came up with the same error: gem install --no-rdoc --no-ri mysql -- --with-mysql-dir=/usr/local/mysql-5.5.28-osx10.6-x86_64/bin --with-mysql-config=/usr/local/mysql-5.5.28-osx10.6-x86_64/bin/mysql_config Following Matt's suggestion, here's what rake --trace db:create reveals: ** Invoke db:create (first_time) ** Invoke db:load_config (first_time) ** Invoke rails_env (first_time) ** Execute rails_env ** Execute db:load_config ** Execute db:create Couldn't create database for {"database"=>"adva_development", "adapter"=>"mysql", "host"=>"127.0.0.1", "password"=>"woof2adva", "username"=>"adva", "encoding"=>"utf8"}, charset: utf8, collation: utf8_unicode_ci (if you set the charset manually, make sure you have a matching collation) After 3 days and six or seven hours, I have pretty much run out of options. I tried various random things, like replacing localhost with 127.0.0.1, to no avail. Could something be wrong with my specific environment (Mac OS X Lion + MySQL 5.5.28)? I plan on trying to set everything up in a Linux environment. Thanks!
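
    One hedged workaround while the root cause stays unclear: create the databases and grants by hand so rake db:create finds them already in place. The charset and collation below match what the error message says the app expects; the password is a placeholder, and the MySQL 5.5 GRANT ... IDENTIFIED BY syntax is assumed:

      CREATE DATABASE adva_development CHARACTER SET utf8 COLLATE utf8_unicode_ci;
      CREATE DATABASE adva_test        CHARACTER SET utf8 COLLATE utf8_unicode_ci;
      CREATE DATABASE adva_production  CHARACTER SET utf8 COLLATE utf8_unicode_ci;

      GRANT ALL PRIVILEGES ON adva_development.* TO 'adva'@'localhost' IDENTIFIED BY '****';
      GRANT ALL PRIVILEGES ON adva_test.*        TO 'adva'@'localhost' IDENTIFIED BY '****';
      GRANT ALL PRIVILEGES ON adva_production.*  TO 'adva'@'localhost' IDENTIFIED BY '****';
      FLUSH PRIVILEGES;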

    Read the article

  • Why is each request with ActiveRecord doing SHOW TABLE?

    - by dynback.com
    My test page is processed, if the trace is to be believed, in 46 ms, and 11 of those are spent doing this:
    20:53:06.111597 system.db.CDbConnection Opening DB connection
    20:53:06.118046 system.db.CDbCommand Querying SQL: SHOW COLUMNS FROM `questions`
    20:53:06.122476 system.db.CDbCommand Querying SQL: SHOW CREATE TABLE `questions`
    Is this obligatory?
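
    It is not obligatory: in Yii 1.x, ActiveRecord re-reads table metadata on every request unless schema caching is enabled. A hedged config sketch; the connection string and cache class are illustrative choices:

      // protected/config/main.php
      return array(
          'components' => array(
              'cache' => array(
                  'class' => 'CFileCache',         // any ICache implementation works
              ),
              'db' => array(
                  'connectionString' => 'mysql:host=localhost;dbname=app', // adjust
                  'schemaCachingDuration' => 3600, // seconds to reuse SHOW COLUMNS results
              ),
          ),
      );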

    Read the article

  • SQL SERVER – Example of Performance Tuning for Advanced Users with DB Optimizer

    - by Pinal Dave
    Performance tuning is a subject that everyone wants to master. In the beginning everybody is at a novice level and spends lots of time learning the art of performance tuning; however, as we progress further, tuning the system keeps getting more difficult. I learned early in my career that there should be no room for ego in the technology field: there are always better solutions and better ideas out there, and we should not resist them. Instead of resisting change and the new wave, I personally adopt them. Here is a similar example. As I progressed toward the master level of performance tuning, I found it harder and harder to come up with optimal solutions. In such scenarios I rely on various tools to teach me how I can do things better, and once I learn about a tool I am often able to come up with better solutions myself the next time I face a similar situation. A few days ago I received a query where the user wanted to tune it further to get the maximum performance out of it. I have re-written a similar query with the help of the AdventureWorks sample database:
    SELECT * FROM HumanResources.Employee e INNER JOIN HumanResources.EmployeeDepartmentHistory edh ON e.BusinessEntityID = edh.BusinessEntityID INNER JOIN HumanResources.Shift s ON edh.ShiftID = s.ShiftID;
    The user's query, similar to the one above, was used in a very critical report, and he wanted to get the best out of it. When I looked at the query, my initial thoughts were: select only the columns the application actually needs; and look at the query pattern and data workload to find the optimal index for it. Before I could offer solutions, the user told me that they need all the columns from all the tables, and that creating an index was not allowed on their system; he could only re-write queries or use hints to tune this query further. Now I was boxed in by constraints: I believe * is not a great idea, but if they want all the columns, we can't do much besides using *. Additionally, since I could not create an index, I had to come up with some creative way to re-write this query. I personally do not like using hints in my applications, but there are cases where hints work magically and give optimal solutions. Finally, I decided to use Embarcadero's DB Optimizer. It is a fantastic tool and very helpful for performance tuning; I have previously explained how it works over here. First open DB Optimizer and create a tuning job from File >> New >> Tuning Job. Once the tuning job is open, follow the steps indicated in the following diagram. Essentially we take our original script and paste it into Step 1 (New SQL Text), then enable Step 2 (generate the various cases), Step 3 (detailed analysis), and Step 4 (execute each generated case). Finally we click Analysis in Step 5, which generates the detailed analysis report in the result pane. The tool generates various re-writes of the original T-SQL, applies the applicable query hints, produces an execution plan for each case, and displays them all in the results. You can clearly see that the original query had a cost of 0.0841 and about 607 logical reads, while the generated cases have varying execution costs and logical reads: a few cases read more pages, and a few read far fewer.
    If we pay attention, the row right after the original query has Merge_Join_Query in its description, with the lowest execution cost (0.044) and the lowest logical reads (29). This row contains the most optimal re-write of the original query. Let us double-click it. Here is the query:
    SELECT * FROM HumanResources.Employee e INNER JOIN HumanResources.EmployeeDepartmentHistory edh ON e.BusinessEntityID = edh.BusinessEntityID INNER JOIN HumanResources.Shift s ON edh.ShiftID = s.ShiftID OPTION (MERGE JOIN)
    Notice that the query now carries the additional MERGE JOIN hint; with the help of this hint the query performs much better than before, and the entire process took less than 60 seconds. Please note that the MERGE JOIN hint was optimal for this query, but that does not mean the same hint will help every query; and if the workload or data pattern changes, MERGE JOIN may no longer be the optimal join, in which case the entire exercise must be redone. This is the reason I do not like to use hints in my queries, and I discourage all of my users from doing the same. In this example, however, the hint genuinely optimizes the query's performance. It is humanly impossible to test every query hint and index option against a query to figure out the most optimal combination; sometimes we need to depend on efficiency tools like DB Optimizer to guide us, and then select the best of the suggested options. Let me know what you think of this article, as well as your experience with DB Optimizer. Please leave a comment. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Joins, SQL Optimization, SQL Performance, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Dump Trac DB on Windows/XAMPP

    - by Whiteknight
    I have a Trac instance running on a Windows XP machine with XAMPP, and I am trying to migrate it to a newer Linux-based machine. However, I'm having a hard time getting the database to cooperate. I try to dump the DB with this command: sqlite3 C:\tracroot\db\trac.db ".dump" >> mysqldump.sql But the generated file is mostly empty: BEGIN TRANSACTION; COMMIT; So that's not right. For the record, my Trac instance is running now and appears to have full access to all the contents of the DB, but sqlite3 (located in C:\xampp\apache\bin) can't seem to get any information from the file. The DB file itself has the header "SQLite format 3", so that seems to be correct. I need to know one of two things: how to get this dump working, OR an alternate way to migrate the Trac database to the new machine. Update: when I try to open the .db file in sqlite3, I get the error "Error: unsupported file format". What format is it in, and why is it unsupported?
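
    A hedged note on the likely cause: "unsupported file format" from the sqlite3 shell usually means the shell is older than the database file's format, which fits a stale sqlite3.exe bundled under C:\xampp. Two things to try; the tool path and backup path below are illustrative:

      REM 1. Dump with a current sqlite3 shell downloaded from sqlite.org
      REM    (a single > is enough; >> appends to any existing file):
      C:\tools\sqlite3.exe C:\tracroot\db\trac.db ".dump" > tracdump.sql

      REM 2. Or let Trac copy the whole environment consistently, then move it:
      trac-admin C:\tracroot hotcopy C:\trac-backup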

    Read the article

  • Synchronizing an ERWin model with a Visual Studio 2008 GDR 2/2010 db project

    - by Grant Back
    I am looking for options to get our vast collection of DB objects, across many DBs, into source control (TFS 2010). Once we succeed here, we will work toward generating our alter scripts for a particular DB change via TFS Build. The problem is that our data architecture group is responsible for maintaining the DB objects (excluding SPs), and they work within a model-centric process via ERWin. What this means is that they maintain the DBs via ERWin models and generate alters from the models that are used to release changes. In order to achieve our goal of getting the DB objects (not just the ERWin models) into TFS, I believe the best option is to do this via Visual Studio DB projects. From what I can tell, there is very little urgency for CA to continue supporting an integration between ERWin and Visual Studio, which no longer works as of Visual Studio 2008 DB Ed. GDR. If I have been misled in this regard, please feel free to set me straight. One potential solution is to: 1. Perform changes in the ERWin model. 2. Take the alter script generated from ERWin and import it into the appropriate Visual Studio DB project, updating the objects in the DB project. 3. Check the changed objects in the DB project into TFS. 4. TFS Build executes to generate the alter scripts that will be used to push the changes through our release process. My question is: is this solution viable, or are there other options?

    Read the article
