Search Results

Search found 17964 results on 719 pages for 'product features'.


  • Is NATURAL JOIN any better than SELECT FROM WHERE in terms of performance?

    - by ashy_32bit
    Today I got into a debate with my project manager about Cartesian products. He says a 'natural join' is somehow much better than using 'select from where', because the latter causes the db engine to internally perform a Cartesian product while the former uses another approach that prevents this. As far as I know, the natural join syntax is not different in any way from 'select from where' in terms of performance or meaning; you can use either based on your taste.

        SELECT * FROM table1, table2 WHERE table1.id = table2.id
        SELECT * FROM table1 NATURAL JOIN table2

    Please elaborate on how the first query causes a Cartesian product while the second one is somehow smarter.
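
    For reference, the explicit-join spelling below is a third way of writing the same query (table and column names are the ones from the question, and it assumes id is the only column name the two tables share, which is what NATURAL JOIN keys on). On mainstream engines all three forms are normally optimized to the same execution plan:

        -- equivalent explicit-join form of the two queries above
        SELECT *
        FROM table1
        INNER JOIN table2
            ON table1.id = table2.id;

    Checking the plans with EXPLAIN on the specific database is the quickest way to settle the debate.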

    Read the article

  • Write a program for a report derived from the data in the data file JEWELRY. The data is to be input

    - by Taylor
    Here is the JEWELRY file:

        0011 Money_Clip 2.000 50.00 Other
        0035 Paperweight 1.625 175.00 Other
        0457 Cuff_Bracelet 2.375 150.00 Bracelet
        0465 Links_Bracelet 7.125 425.00 Bracelet
        0585 Key_Chain 1.325 50.00 Other
        0595 Cuff_Links 0.625 525.00 Other
        0935 Royale_Pendant 0.625 975.00 Pendant
        1092 Bordeaux_Cross 1.625 425.00 Cross
        1105 Victory_Medallion 0.875 30.00 Pendant
        1111 Marquis_Cross 1.375 70.00 Cross
        1160 Christina_Ring 0.500 175.00 Ring
        1511 French_Clips 0.687 375.00 Other
        1717 Pebble_Pendant 1.250 45.00 Pendant
        1725 Folded_Pendant 1.250 45.00 Pendant
        1730 Curio_Pendant 1.063 275.00 Pendant

    This is the program I have used:

        #include <iostream>
        #include <string>
        #include <iomanip>
        #include <fstream>
        using namespace std;

        struct productJewelry {
            string name;
            double amount;
            int itemCode;
            double size;
            string group;
        };

        int main() {
            // declare variables
            ifstream inFile;
            int count = 0;
            int x = 0;
            productJewelry product[50];
            inFile.open("jewelry.txt"); // file must be in same folder
            if (inFile.fail())
                cout << "failed";
            cout << fixed << showpoint; // fixed format, two decimal places
            cout << setprecision(2);
            while (inFile.peek() != EOF) {
                // cout << count << " : ";
                count++;
                inFile >> product[x].itemCode;
                inFile >> product[x].name;
                inFile >> product[x].size;
                inFile >> product[x].amount;
                inFile >> product[x].group;
                // cout << product[x].itemCode << ", " << product[x].name << ", " << product[x].size << ", " << product[x].amount << endl;
                x++;
                if (inFile.peek() == '\n')
                    inFile.ignore(1, '\n');
            }
            inFile.close();
            string temp;
            bool swap;
            do {
                swap = false;
                for (int x = 0; x < count; x++) {
                    if (product[x].name > product[x+1].name) {
                        // these 3 lines are to swap elements in array
                        temp = product[x].name;
                        product[x].name = product[x+1].name;
                        product[x+1].name = temp;
                        swap = true;
                    }
                }
            } while (swap);
            for (x = 0; x < count; x++) {
                //cout << product[x].itemCode << " ";
                //cout << product[x].name << " ";
                //cout << product[x].size << " ";
                //cout << product[x].amount << " ";
                //cout << product[x].group << " " << endl;
            }
            system("pause"); // to freeze Dev-c++ output screen
            return 0;
        } // end main
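
    Two things stand out in the sort loop above: it compares product[x] against product[x+1] all the way up to x == count-1, which reads one element past the loaded data, and it swaps only the name field, so names become detached from their codes and prices; the final print loop is also entirely commented out. A minimal corrected sketch of just the sort-and-print part (same struct and variable names as the question, placed inside main after the read loop; a sketch, not a drop-in answer to the assignment):

        // swap whole records, stop one short of count, then print the report
        for (bool swapped = true; swapped; ) {
            swapped = false;
            for (int i = 0; i + 1 < count; i++) {
                if (product[i].name > product[i + 1].name) {
                    productJewelry tmp = product[i];   // swap the entire record,
                    product[i] = product[i + 1];       // not just the name
                    product[i + 1] = tmp;
                    swapped = true;
                }
            }
        }
        for (int i = 0; i < count; i++) {
            cout << product[i].itemCode << " " << product[i].name << " "
                 << product[i].size << " " << product[i].amount << " "
                 << product[i].group << endl;
        }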

    Read the article

  • ZF2 namespace and file system

    - by user1918648
    I have the standard Application module in ZF2. It's configured by default; I didn't change anything. I just added some files:

        module/
            Application/
                src/
                    Application/
                        Entity/
                            Product/
                                Product.php
                        Controller/
                            IndexController.php

    Product.php:

        namespace Application\Entity;

        class Product
        {
        }

    IndexController.php:

        namespace Application\Controller;

        use Zend\Mvc\Controller\AbstractActionController;
        use Zend\View\Model\ViewModel;
        use Application\Entity\Product;

        class IndexController extends AbstractActionController
        {
            public function indexAction()
            {
                $product = new Product();
            }
        }

    and I get the following error:

        Fatal error: Class 'Application\Entity\Product' not found in \module\Application\src\Application\Controller\IndexController.php on line 20

    I use the same namespace, but it doesn't see the class. Why?

    P.S.: If I change Product.php to the following:

        namespace Application\Entity\Product;

        class Product
        {
        }

    then the following code in IndexController.php works:

        namespace Application\Controller;

        use Zend\Mvc\Controller\AbstractActionController;
        use Zend\View\Model\ViewModel;
        use Application\Entity\Product\Product;

        class IndexController extends AbstractActionController
        {
            public function indexAction()
            {
                $product = new Product();
            }
        }
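
    The likely cause, assuming the skeleton module's default PSR-0 style autoloading, is that the file path under src/ does not mirror the namespace: the class is declared as Application\Entity\Product but the file sits in Entity/Product/Product.php, where the autoloader would expect a class named Application\Entity\Product\Product. A sketch of the layout that matches the original namespace:

        <?php
        // file: module/Application/src/Application/Entity/Product.php
        // (directly under Entity/, no extra Product/ folder - this path is the point)
        namespace Application\Entity;

        class Product
        {
        }

    That would also explain why the P.S. variant works: once the namespace is Application\Entity\Product, the existing Entity/Product/Product.php path matches it again.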

    Read the article

  • Company wants to write a custom project management tool, rather than use a third-party product.

    - by Jason Evans
    At the company I work, we are really wanting to get into the agile methodology for developing software. One thing that I'm not excited about is the fact that management wants us to build a custom project management feature inside the company's Intranet. I think this is a total waste of time. There are many great third party tools available (e.g. Axosoft OnTime) that can do everything we need, and more. For how much development time it would cost us to build our own project management module, we could buy numerous licences for a third party product. One concern is that, whilst we are writing code for a client, and using our custom Intranet project management module, we find bugs in the module that need fixing ASAP. That means having to stop work on the client code to fix the Intranet. That just puts shivers down my spine. Another worry I have is lack of functionality. This custom module is going to be so basic, that it will just feel really crap to use. That might sound a bit snooty, but for goodness sake, many third party tools are so feature rich, that the idea of having to write our own tool makes feel very uneasy. In fact, I can't be bothered. What do you guys think? I'm going to raise this issue with my boss, since I feel it's such an important topic to talk about. EDIT: Thanks for the great responses, much appreciated. To summarize some of them: Money Naturally my boss does want to save money, by not forking out a few hundred £'s for licences. However, for us to write a custom tool, it will take x number of days, multiplied by approx £500, which is our costs. I don't see the business value in this. Management have mentioned that they want to sell the Intranet as a product in the future, but it's so custom to our needs (and downright basic), that in order to give it to another client, I can see us having to fork a version of the code and rebuild the majority of it anyway. So it's not like we're gaining anything there in reuse. Features Having our own custom module means not feature bloat - only the functionality we require will be in the product. My issue is that there are plenty of free, open-source project management tools out there with minimal features already. So even if cost is an issue, we could look into open-source. Again it all boils down to the fact that I don't see the point in writing a project management tool in this day and age. It's a bit like writing your own web browser - why?, what's the point? Although management are asking for this tool, just because they are, it does not mean I'm going to please them and do it just because they asked for it. If something does not make sense, then I will raise it as a concern. At the end of the day, it's the developers who write the code, it's the developers who make money for a business. Thus, as far I'm concerned, the devs have a very big role in deciding how a company should manage projects and what tools are used. "I am Spartan, argh!" :) Hmm, I've not been able to make this question a wiki for some reason, thus I'm going to have to pick an answer to accept. Cheers. Jas.

    Read the article

  • Magento, NGINX, PHP-FPM, APC, MEMCACHED, 16gb Ram CentOS, Spiking PHP-FPM to 100% CPU

    - by Terry Dunford
    I have been trying to resolve my issue of spiking cpu caused by php-fpm processes. I've reduced the php-fpm config settings to: pm = ondemand pm.max_children = 12 pm.start_servers = 2 pm.min_spare_servers = 2 pm.max_spare_servers = 10 pm.max_requests = 500 php_admin_value[memory_limit] = 128M Problem still exists. I'm running a Joomla main site (which is having no problems) and a Magento store in a sub-directory. My server is a Linux CentOS, running NGINX, APC, Memcached, Full Page Cache and php-fpm. My server has 8 cores and 16gb dedicated ram. My host has shut down my server several times the past week because my php-fpm processes are consuming the entire network. A lot of the individual php-fpm processes are getting over 50% cpu. I've hired several "professionals" and none of them was able to help me, so now broke and stumped, I'm turning to you guys for help. So any suggestions would be greatly appreciated. I turned on slow php logs and here are some of the latest results: [01-Apr-2012 14:26:12] [pool magento] pid 21537 script_filename = /home/flyfish/www/flyshop/index.php [0x0000000011a394f8] _renderStraightjoin() /home/flyfish/www/flyshop/lib/Varien/Db/Select.php:397 [0x0000000011a39158] _renderStraightjoin() /home/flyfish/www/flyshop/lib/Zend/Db/Select.php:705 [0x0000000011a38f30] assemble() /home/flyfish/www/flyshop/lib/Zend/Db/Select.php:1343 [0x00007fffbb6d6e50] __toString() unknown:0 [0x0000000011a38630] _prepareQuery() /home/flyfish/www/flyshop/lib/Varien/Db/Adapter/Pdo/Mysql.php:409 [0x0000000011a38270] _prepareQuery() /home/flyfish/www/flyshop/lib/Varien/Db/Adapter/Pdo/Mysql.php:388 [0x0000000011a38008] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:734 [0x0000000011a375c8] fetchAll() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Resource/Product/Type/Configurable/Attribute/Collection.php:196 [0x0000000011a370e0] _loadLabels() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Resource/Product/Type/Configurable/Attribute/Collection.php:129 [0x0000000011a369a0] _afterLoad() /home/flyfish/www/flyshop/lib/Varien/Data/Collection/Db.php:536 [0x0000000011a364a8] load() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product/Type/Configurable.php:253 [0x0000000011a35968] getConfigurableAttributes() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product/Type/Configurable.php:330 [0x0000000011a35590] getUsedProducts() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product/Type/Configurable.php:458 [0x0000000011a35410] isSalable() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product.php:1264 [0x0000000011a35098] isAvailable() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product.php:1244 [0x0000000011a34fa8] isSalable() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product.php:1308 [0x0000000011a33998] isSaleable() /home/flyfish/www/flyshop/app/design/frontend/moxy/default/template/rokmagemodules/rokmage-categoryview/rokmage-categoryview.phtml:122 [0x0000000011a331f0] +++ dump failed [01-Apr-2012 14:26:44] [pool magento] pid 21531 script_filename = /home/flyfish/www/flyshop/index.php [0x0000000011a37768] _loadPrices() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Resource/Product/Type/Configurable/Attribute/Collection.php:251 [0x0000000011a37280] _loadPrices() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Resource/Product/Type/Configurable/Attribute/Collection.php:132 [0x0000000011a36b40] _afterLoad() 
/home/flyfish/www/flyshop/lib/Varien/Data/Collection/Db.php:536 [0x0000000011a36648] load() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product/Type/Configurable.php:253 [0x0000000011a35b08] getConfigurableAttributes() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product/Type/Configurable.php:330 [0x0000000011a35730] getUsedProducts() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product/Type/Configurable.php:458 [0x0000000011a355b0] isSalable() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product.php:1264 [0x0000000011a35238] isAvailable() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product.php:1244 [0x0000000011a35148] isSalable() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product.php:1308 [0x0000000011a33b38] isSaleable() /home/flyfish/www/flyshop/app/design/frontend/moxy/default/template/rokmagemodules/rokmage-categoryview/rokmage-categoryview.phtml:122 [0x0000000011a33390] +++ dump failed [01-Apr-2012 14:27:01] [pool magento] pid 21528 script_filename = /home/flyfish/www/flyshop/index.php [0x0000000011ff67a8] execute() /home/flyfish/www/flyshop/lib/Zend/Db/Statement/Pdo.php:228 [0x0000000011ff6518] _execute() /home/flyfish/www/flyshop/lib/Varien/Db/Statement/Pdo/Mysql.php:110 [0x0000000011ff5e90] _execute() /home/flyfish/www/flyshop/lib/Zend/Db/Statement.php:300 [0x0000000011ff5a20] execute() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:479 [0x0000000011ff5438] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Pdo/Abstract.php:238 [0x0000000011ff5078] query() /home/flyfish/www/flyshop/lib/Varien/Db/Adapter/Pdo/Mysql.php:389 [0x0000000011ff4e98] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:825 [0x0000000011ff4948] fetchOne() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Resource/Category/Flat.php:1161 [0x0000000011ff4678] getProductCount() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Category.php:801 [0x0000000011ff33e0] getProductCount() /home/flyfish/www/flyshop/app/code/local/Extendware/EWLayeredNav/Model/Library/Plugin/Catalog/Layer/Filter/Category.php:54 [0x0000000011ff2da0] _initItemsData() /home/flyfish/www/flyshop/app/code/local/Extendware/EWLayeredNav/Model/Library/Plugin/Catalog/Layer/Filter/Category.php:23 [0x0000000011ff2818] _getItemsData() /home/flyfish/www/flyshop/app/code/local/Extendware/EWLayeredNav/Model/Library/Plugin/Catalog/Layer/Filter/Category.php:119 [0x0000000011ff26b0] _initItems() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Layer/Filter/Abstract.php:120 [0x0000000011ff2598] getItems() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Layer/Filter/Abstract.php:109 [0x0000000011ff2480] getItemsCount() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Block/Layer/Filter/Abstract.php:126 [0x0000000011ff22b8] getItemsCount() /home/flyfish/www/flyshop/var/cache/extendware/ewcore/overrides/Mage/Catalog/Block/Layer/View/67dcc5dfa9c44bd3a205b75a08193105.php:218 [0x0000000011ff2088] canShowOptions() /home/flyfish/www/flyshop/var/cache/extendware/ewcore/overrides/Mage/Catalog/Block/Layer/View/67dcc5dfa9c44bd3a205b75a08193105.php:233 [0x0000000011ff14f8] canShowBlock() /home/flyfish/www/flyshop/app/design/frontend/moxy/default/template/extendware/ewlayerednav/catalog/layer/view.phtml:6 [0x0000000011ff0d50] +++ dump failed [01-Apr-2012 14:27:04] [pool magento] pid 21529 script_filename = /home/flyfish/www/flyshop/index.php [0x0000000012468ff8] execute() 
/home/flyfish/www/flyshop/lib/Zend/Db/Statement/Pdo.php:228 [0x0000000012468d68] _execute() /home/flyfish/www/flyshop/lib/Varien/Db/Statement/Pdo/Mysql.php:110 [0x00000000124686e0] _execute() /home/flyfish/www/flyshop/lib/Zend/Db/Statement.php:300 [0x0000000012468270] execute() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:479 [0x0000000012467c88] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Pdo/Abstract.php:238 [0x00000000124678c8] query() /home/flyfish/www/flyshop/lib/Varien/Db/Adapter/Pdo/Mysql.php:389 [0x0000000012467660] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:734 [0x0000000012467248] fetchAll() /home/flyfish/www/flyshop/lib/Varien/Data/Collection/Db.php:687 [0x00000000124668f0] _fetchAll() /home/flyfish/www/flyshop/app/code/core/Mage/Eav/Model/Entity/Collection/Abstract.php:1045 [0x0000000012466288] _loadEntities() /home/flyfish/www/flyshop/app/code/core/Mage/Eav/Model/Entity/Collection/Abstract.php:869 [0x0000000012465fb0] load() /home/flyfish/www/flyshop/app/code/core/Mage/Review/Model/Observer.php:78 [0x0000000012465d10] catalogBlockProductCollectionBeforeToHtml() /home/flyfish/www/flyshop/app/code/core/Mage/Core/Model/App.php:1303 [0x0000000012464c28] _callObserverMethod() /home/flyfish/www/flyshop/app/code/core/Mage/Core/Model/App.php:1278 [0x00000000124649e0] dispatchEvent() /home/flyfish/www/flyshop/app/Mage.php:416 [0x0000000012464290] dispatchEvent() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Block/Product/List.php:163 [0x0000000012463760] _beforeToHtml() /home/flyfish/www/flyshop/var/ait_rewrite/6bfe16ca572eea47db567910902c6209.php:864 [0x00000000124633b0] toHtml() /home/flyfish/www/flyshop/var/ait_rewrite/6bfe16ca572eea47db567910902c6209.php:584 [0x0000000012462e30] _getChildHtml() /home/flyfish/www/flyshop/var/ait_rewrite/6bfe16ca572eea47db567910902c6209.php:528 [0x0000000012462d38] getChildHtml() /home/flyfish/www/flyshop/var/cache/extendware/ewcore/overrides/Mage/Catalog/Block/Category/View/6362e7526f5dcb27e7f8b0b414b59004.php:85 [0x00000000124629f0] getProductListHtml() /home/flyfish/www/flyshop/app/code/local/Extendware/EWLayeredNav/Block/Override/Mage/Catalog/Category/View.php:20 [01-Apr-2012 14:27:55] [pool magento] pid 21536 script_filename = /home/flyfish/www/flyshop/index.php [0x0000000011a35010] execute() /home/flyfish/www/flyshop/lib/Zend/Db/Statement/Pdo.php:228 [0x0000000011a34d80] _execute() /home/flyfish/www/flyshop/lib/Varien/Db/Statement/Pdo/Mysql.php:110 [0x0000000011a346f8] _execute() /home/flyfish/www/flyshop/lib/Zend/Db/Statement.php:300 [0x0000000011a34288] execute() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:479 [0x0000000011a33ca0] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Pdo/Abstract.php:238 [0x0000000011a338e0] query() /home/flyfish/www/flyshop/lib/Varien/Db/Adapter/Pdo/Mysql.php:389 [0x0000000011a33700] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:825 [0x0000000011a33368] fetchOne() /home/flyfish/www/flyshop/app/code/core/Mage/Eav/Model/Resource/Entity/Type.php:71 [0x0000000011a33238] getAdditionalAttributeTable() /home/flyfish/www/flyshop/app/code/core/Mage/Eav/Model/Resource/Entity/Attribute.php:483 [0x0000000011a32be8] getAdditionalAttributeTable() /home/flyfish/www/flyshop/app/code/core/Mage/Eav/Model/Resource/Entity/Attribute.php:500 [0x0000000011a32860] _afterLoad() /home/flyfish/www/flyshop/app/code/core/Mage/Eav/Model/Resource/Entity/Attribute.php:108 [0x0000000011a32330] loadByCode() 
/home/flyfish/www/flyshop/app/code/core/Mage/Eav/Model/Entity/Attribute/Abstract.php:118 [0x0000000011a31350] loadByCode() /home/flyfish/www/flyshop/app/code/core/Mage/Eav/Model/Config.php:423 [0x0000000011a30ce8] getAttribute() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Helper/Output.php:156 [0x0000000011a30208] categoryAttribute() /home/flyfish/www/flyshop/app/design/frontend/base/default/template/catalog/category/view.phtml:47 [0x0000000011a2fa60] +++ dump failed [01-Apr-2012 14:27:56] [pool magento] pid 21530 script_filename = /home/flyfish/www/flyshop/index.php [0x0000000011a35b10] updateParamDefaults() /home/flyfish/www/flyshop/var/ait_rewrite/78778b0d1ad4bf93e846365bd2fbf33f.php:276 [0x0000000011a35750] updateParamDefaults() /home/flyfish/www/flyshop/var/ait_rewrite/78778b0d1ad4bf93e846365bd2fbf33f.php:326 [0x0000000011a351f0] getSkinBaseUrl() /home/flyfish/www/flyshop/var/ait_rewrite/78778b0d1ad4bf93e846365bd2fbf33f.php:482 [0x0000000011a350a8] getSkinUrl() /home/flyfish/www/flyshop/var/ait_rewrite/6bfe16ca572eea47db567910902c6209.php:981 [0x0000000011a32468] getSkinUrl() /home/flyfish/www/flyshop/app/code/local/Extendware/EWMinify/Block/Override/Mage/Page/Html/Head.php:126 [0x0000000011a30ca8] getCssJsHtml() /home/flyfish/www/flyshop/app/code/local/Extendware/EWCore/Block/Override/Mage/Page/Html/Head.php:55 [0x0000000011a30978] getCssJsHtml() /home/flyfish/www/flyshop/app/code/local/MageWorx/SeoSuite/Block/Page/Html/Head.php:41 [0x0000000011a2fd10] getCssJsHtml() /home/flyfish/www/flyshop/app/design/frontend/moxy/default/template/rokmagemodules/rokmage-modalheader/rokmage-head.phtml:26 [0x0000000011a2f568] +++ dump failed [01-Apr-2012 14:28:28] [pool magento] pid 21527 script_filename = /home/flyfish/www/flyshop/index.php [0x0000000010c7bba0] execute() /home/flyfish/www/flyshop/lib/Zend/Db/Statement/Pdo.php:228 [0x0000000010c7b910] _execute() /home/flyfish/www/flyshop/lib/Varien/Db/Statement/Pdo/Mysql.php:110 [0x0000000010c7b288] _execute() /home/flyfish/www/flyshop/lib/Zend/Db/Statement.php:300 [0x0000000010c7ae18] execute() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:479 [0x0000000010c7a830] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Pdo/Abstract.php:238 [0x0000000010c7a470] query() /home/flyfish/www/flyshop/lib/Varien/Db/Adapter/Pdo/Mysql.php:389 [0x0000000010c7a168] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:808 [0x0000000010c79558] fetchPairs() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Resource/Product/Collection.php:840 [0x0000000010c79240] addCountToCategories() /home/flyfish/www/flyshop/app/code/community/Mage/Catalog/Block/Navigation.php:133 [0x0000000010c71d48] getCurrentChildCategories() /home/flyfish/www/flyshop/app/design/frontend/base/default/template/rokmagemodules/rokmage-magemenus/rokmage-magemenu-left.phtml:139 [0x0000000010c715a0] +++ dump failed [01-Apr-2012 14:28:28] [pool magento] pid 21577 script_filename = /home/flyfish/www/flyshop/index.php [0x0000000011a3a8d8] execute() /home/flyfish/www/flyshop/lib/Zend/Db/Statement/Pdo.php:228 [0x0000000011a3a648] _execute() /home/flyfish/www/flyshop/lib/Varien/Db/Statement/Pdo/Mysql.php:110 [0x0000000011a39fc0] _execute() /home/flyfish/www/flyshop/lib/Zend/Db/Statement.php:300 [0x0000000011a39b50] execute() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:479 [0x0000000011a39568] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Pdo/Abstract.php:238 [0x0000000011a391a8] query() 
/home/flyfish/www/flyshop/lib/Varien/Db/Adapter/Pdo/Mysql.php:389 [0x0000000011a38f40] query() /home/flyfish/www/flyshop/lib/Zend/Db/Adapter/Abstract.php:734 [0x0000000011a37cc0] fetchAll() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Resource/Category/Flat.php:276 [0x0000000011a37b20] _loadNodes() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Resource/Category/Flat.php:1229 [0x0000000011a379a0] getChildrenCategories() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Category.php:841 [0x0000000011a37690] getChildrenCategories() /home/flyfish/www/flyshop/app/code/community/Mage/Catalog/Block/Navigation.php:130 [0x0000000011a30198] getCurrentChildCategories() /home/flyfish/www/flyshop/app/design/frontend/base/default/template/rokmagemodules/rokmage-magemenus/rokmage-magemenu-left.phtml:139 [0x0000000011a2f9f0] +++ dump failed [01-Apr-2012 14:28:48] [pool magento] pid 21629 script_filename = /home/flyfish/www/flyshop/index.php [0x00002ac987e2cb48] _loadPrices() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Resource/Product/Type/Configurable/Attribute/Collection.php:252 [0x00002ac987e2c660] _loadPrices() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Resource/Product/Type/Configurable/Attribute/Collection.php:132 [0x00002ac987e2bf20] _afterLoad() /home/flyfish/www/flyshop/lib/Varien/Data/Collection/Db.php:536 [0x00002ac987e2ba28] load() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product/Type/Configurable.php:253 [0x00002ac987e2aee8] getConfigurableAttributes() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product/Type/Configurable.php:330 [0x00002ac987e2ab10] getUsedProducts() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product/Type/Configurable.php:458 [0x00002ac987e2a990] isSalable() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product.php:1264 [0x00002ac987e2a618] isAvailable() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product.php:1244 [0x00002ac987e2a528] isSalable() /home/flyfish/www/flyshop/app/code/core/Mage/Catalog/Model/Product.php:1308 [0x00002ac987e28f18] isSaleable() /home/flyfish/www/flyshop/app/design/frontend/moxy/default/template/rokmagemodules/rokmage-categoryview/rokmage-categoryview.phtml:122 [0x00002ac987e28770] +++ dump failed ___________________________________________ A snippet of the Latest php-fpm error log: [01-Apr-2012 14:26:12] WARNING: [pool magento] child 21537, script '/home/flyfish/www/flyshop/index.php' (request: "GET /flyshop/index.php") executing too slow (5.265105 sec), logging [01-Apr-2012 14:26:12] ERROR: failed to ptrace(PEEKDATA) pid 21537: Input/output error (5) [01-Apr-2012 14:26:44] WARNING: [pool magento] child 21531, script '/home/flyfish/www/flyshop/index.php' (request: "GET /flyshop/index.php") executing too slow (5.268434 sec), logging [01-Apr-2012 14:26:44] ERROR: failed to ptrace(PEEKDATA) pid 21531: Input/output error (5) [01-Apr-2012 14:27:01] WARNING: [pool magento] child 21528, script '/home/flyfish/www/flyshop/index.php' (request: "GET /flyshop/index.php") executing too slow (6.656633 sec), logging [01-Apr-2012 14:27:01] ERROR: failed to ptrace(PEEKDATA) pid 21528: Input/output error (5) [01-Apr-2012 14:27:04] WARNING: [pool magento] child 21529, script '/home/flyfish/www/flyshop/index.php' (request: "GET /flyshop/index.php") executing too slow (5.211136 sec), logging [01-Apr-2012 14:27:55] WARNING: [pool magento] child 21536, script '/home/flyfish/www/flyshop/index.php' (request: "GET 
/flyshop/index.php") executing too slow (5.207001 sec), logging [01-Apr-2012 14:27:55] ERROR: failed to ptrace(PEEKDATA) pid 21536: Input/output error (5) [01-Apr-2012 14:27:56] WARNING: [pool magento] child 21530, script '/home/flyfish/www/flyshop/index.php' (request: "GET /flyshop/index.php") executing too slow (5.503186 sec), logging [01-Apr-2012 14:27:56] ERROR: failed to ptrace(PEEKDATA) pid 21530: Input/output error (5) [01-Apr-2012 14:28:28] WARNING: [pool magento] child 21577, script '/home/flyfish/www/flyshop/index.php' (request: "GET /flyshop/index.php") executing too slow (5.722625 sec), logging [01-Apr-2012 14:28:28] WARNING: [pool magento] child 21527, script '/home/flyfish/www/flyshop/index.php' (request: "GET /flyshop/index.php") executing too slow (5.122326 sec), logging [01-Apr-2012 14:28:28] ERROR: failed to ptrace(PEEKDATA) pid 21527: Input/output error (5) [01-Apr-2012 14:28:28] ERROR: failed to ptrace(PEEKDATA) pid 21577: Input/output error (5) [01-Apr-2012 14:28:48] WARNING: [pool magento] child 21629, script '/home/flyfish/www/flyshop/index.php' (request: "GET /flyshop/index.php") executing too slow (5.446961 sec), logging [01-Apr-2012 14:28:48] ERROR: failed to ptrace(PEEKDATA) pid 21629: Input/output error (5) _____________________________________________ I also noticed that the server is not using much memory: Mem: 16777216k total, 1204040k used, 15573176k free My.conf settings: query_cache_size = 128M innodb_buffer_pool_size = 512M open-files-limit = 8192 table_cache=4096 I just noticed that someone changed my innodb_buffer_pool_size to 512M. Shouldn't this be set to 80% of available ram? So I have 16gb ram so it should be set at 12G; however, I set it at 10G. What do you think? I made that change and restart everything. Php-fpm is still spiking cpu. Here is just 1 php-fpm process: 23942 user 17 0 507m 99m 27m R 90.9%CPU 0.6 0:03.46 php-fpm I'm sure there may be more information you will need to help, so just let me know what you guys need to help me figure this out. Thank you.
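
    For reference, here are the memory-related MySQL settings discussed above collected into one sketch; the values are purely illustrative (roughly what the question ends up with) and are not a recommendation for any particular box:

        # my.cnf sketch - illustrative values only
        [mysqld]
        innodb_buffer_pool_size = 10G    # below the usual 70-80% rule of thumb because
                                         # PHP-FPM, nginx, APC and memcached share the 16 GB box
        query_cache_size        = 128M
        table_cache             = 4096
        open-files-limit        = 8192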

    Read the article

  • How to Programmatically Split and Manipulate Rows of Data From Excel

    - by Charlene
    I am hoping one of you will be able to help get me started on this issue. I need to create some sort of macro or VBA code to split and manipulate rows of data in Excel. For this example, we have 5 rows of data. The first 3 rows are item information for order # 0000000000-00 and the last 2 rows are item information for order # 0000000000-01. I need one row ("HDR") for each order number, and one row ("ITM") for each product per order. I have included an example below showing the data I will receive and the desired outcome.

    Raw Data:

        order-id       product-num     date       buyer-name  product-name  quantity-purchased
        0000000000-00  10000000000000  5/29/2014  John Doe    Product 0     1
        0000000000-00  10000000000001  5/29/2014  John Doe    Product 1     2
        0000000000-00  10000000000002  5/29/2014  John Doe    Product 2     1
        0000000000-01  10000000000002  5/30/2014  Jane Doe    Product 2     1
        0000000000-01  10000000000003  5/30/2014  Jane Doe    Product 3     1

    Desired Outcome:

        HDR  0000000000-00   John Doe   5/29/2014
        ITM  10000000000000  Product 0  1
        ITM  10000000000001  Product 1  2
        ITM  10000000000002  Product 2  1
        HDR  0000000000-01   Jane Doe   5/30/2014
        ITM  10000000000002  Product 2  1
        ITM  10000000000003  Product 3  1

    Any and all help would be much appreciated!!! Thank you.
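
    A minimal VBA sketch of the kind of macro being asked for. It assumes the raw data (including the header row) sits on a sheet named "Raw" starting at A1, that rows are already grouped by order-id, and that the result goes to a sheet named "Out"; the sheet names and starting cell are assumptions, not part of the question:

        Sub SplitOrders()
            Dim src As Worksheet, dst As Worksheet
            Dim r As Long, outRow As Long
            Dim lastOrder As String
            Set src = Worksheets("Raw")
            Set dst = Worksheets("Out")
            outRow = 1
            r = 2   ' first data row, below the headers
            Do While src.Cells(r, 1).Value <> ""
                If src.Cells(r, 1).Value <> lastOrder Then
                    ' new order-id: write an HDR row (order-id, buyer-name, date)
                    lastOrder = src.Cells(r, 1).Value
                    dst.Cells(outRow, 1).Value = "HDR"
                    dst.Cells(outRow, 2).Value = lastOrder
                    dst.Cells(outRow, 3).Value = src.Cells(r, 4).Value
                    dst.Cells(outRow, 4).Value = src.Cells(r, 3).Value
                    outRow = outRow + 1
                End If
                ' one ITM row per product (product-num, product-name, quantity)
                dst.Cells(outRow, 1).Value = "ITM"
                dst.Cells(outRow, 2).Value = src.Cells(r, 2).Value
                dst.Cells(outRow, 3).Value = src.Cells(r, 5).Value
                dst.Cells(outRow, 4).Value = src.Cells(r, 6).Value
                outRow = outRow + 1
                r = r + 1
            Loop
        End Sub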

    Read the article

  • SharePoint Item Event Receivers and Site Creation

    - by Michael Edwards
    I have created an Item Event Receiver for a document library, and I have tested that the logic works correctly; it all does. The next thing I wanted to do is automatically create the list when a site is created, so I added the list to the ONET.xml file for the site:

        <Lists>
            <List Title="Documents" Description="Documents " url="MyDocumentLibrary" Type="10002"
                  FeatureId="CFD8504D-70EB-4ba2-9CCB-52E38DB39E60" QuickLaunchUrl="Docs/AllItems.aspx" />
        </Lists>

    and I ensured that the feature for this list is also activated by adding it to the <WebFeatures> element:

        <WebFeatures>
            <Feature ID="CFD8504D-70EB-4ba2-9CCB-52E38DB39E60" />
        </WebFeatures>

    The problem occurs after I create the site: when I add a document to the list, the Item Event Receiver does not run. However, if I manually go to the web site features and deactivate and then reactivate the feature, the Item Event Receiver does run. It seems that when the list is created through ONET.xml and the feature is activated, the Item Event Receiver is not bound to the list. What is the workaround for this? Is this a bug?
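
    One workaround commonly used for this situation is to bind the event receiver explicitly in the feature's FeatureActivated handler, so it gets attached regardless of how the list was provisioned; whether it applies here depends on how the receiver is currently registered. A rough C# sketch, where MyReceivers.ItemReceiver and the ItemAdded event type are placeholders, not details taken from the question:

        using System.Reflection;
        using Microsoft.SharePoint;

        public class BindReceiverFeatureReceiver : SPFeatureReceiver
        {
            public override void FeatureActivated(SPFeatureReceiverProperties properties)
            {
                // assumes a Web-scoped feature; "Documents" is the list title from ONET.xml
                SPWeb web = (SPWeb)properties.Feature.Parent;
                SPList list = web.Lists["Documents"];
                list.EventReceivers.Add(
                    SPEventReceiverType.ItemAdded,                // placeholder event type
                    Assembly.GetExecutingAssembly().FullName,
                    "MyReceivers.ItemReceiver");                  // placeholder class name
            }

            public override void FeatureDeactivating(SPFeatureReceiverProperties properties) { }
            public override void FeatureInstalled(SPFeatureReceiverProperties properties) { }
            public override void FeatureUninstalling(SPFeatureReceiverProperties properties) { }
        }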

    Read the article

  • Are there solutions for streamlining the update of legacy code in multiple places?

    - by ccomet
    I'm working in some old code which was originally designed for handling two different kinds of files. I was recently tasked with adding a new kind of file to this code. Most of my problems were solved by filling out an extensive XML file with a new entry that handled everything from what lists were named to how the file is written in plural lower case. But this ended up being insufficient, as there were maybe 50 different places in 24 different code files where I had to update hardcoded switch-statements that only branched for the original two file types. Unfortunately there is no consistency in this; there are methods which operate half from the XML file, and half off of hardcode. Some of the files which look like they would operate off of the XML file don't, and some that I would expect that I'd need to update the hardcode don't need it. So the only way to find the majority of these is to run through testing the whole system when only part of it is operational, finding that one step to fix (when I'm lucky that error logging actually tells me what is going on), and then running the whole thing again. This wastes time testing the parts of the code which are already confirmed to work, time better spent testing the new parts I have to add on top of it all. It's a hassle and a half, and to my luck I can expect that I will have to add yet another new kind of file in the near future. Are there any solutions out there which can aid in this kind of endeavour? Something which I can input some parameters of current features, document what points in a whole code project actually need to be updated, and run something nice the next time I need to add a new feature to the code. It needn't even be fully automated, something that'll help me navigate straight to the specific points in everything and maybe even record what kind of parameters need to be loaded. Doubt it matters specifically, but the code is comprised of ASP.NET pages, some ASP.NET controls, hundreds of C# code files, and a handful of additional XML files. It's all currently in a couple big Visual Studio 2008 projects.

    Read the article

  • Is .NET 4.0 just a show?

    - by Will Marcouiller
    I went to a presentation about the .NET Framework and Visual Studio 2010 last night. The topics were:

    ASP.NET 4 - some of the new features of ASP.NET 4: more control over ClientIDs in WebForms; output caching; some other things I don't really remember, being more in the framework and WinForms world.

    Entity Framework 2.0 (.NET 4.0): T4 templates; domain-driven development; data-driven development; contexts (edmx files); some real-world limitations of EF4 (projects with over 70 to 75 tables); better POCO support, although there are still these hidden EntityObject and StructuralObject classes, used differently than in EF 1.0 so that they don't take over your inheritance; letting you easily choose how to persist the hierarchy into the underlying database; code only (start working with EF4 directly from your code!); Design by Contract (DbC).

    As far as I'm concerned, the only really interesting feature is that parallelism is made easier. And it really works! No additional assembly references to add. In conclusion, I'm far from impressed with .NET Framework 4.0, apart from the fact that it makes some things easier to do. But when you're used to doing things a certain way, it doesn't really change much, in my opinion. Is it just me who cannot foresee what .NET 4.0 has to offer? What would you base your decision to migrate to .NET 4.0 on, in a practical way?
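
    On the parallelism point, here is a small illustrative sketch (not code from the presentation) of the Task Parallel Library that ships in the box with .NET 4.0, with no extra assembly references to add:

        using System;
        using System.Threading.Tasks;

        class ParallelDemo
        {
            static void Main()
            {
                // run three independent pieces of work on the thread pool and wait for all of them
                Parallel.Invoke(
                    () => Console.WriteLine("task A"),
                    () => Console.WriteLine("task B"),
                    () => Console.WriteLine("task C"));

                // data parallelism: square the numbers 0..9 across available cores
                Parallel.For(0, 10, i => Console.WriteLine("{0}^2 = {1}", i, i * i));
            }
        }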

    Read the article

  • Is it wise to ask about design decisions made on a product during an interview?

    - by Desolate Planet
    I've been thinking about interview questions lately and reflecting on bad interview experiences I've had in the past. One of particular note is an interview where I asked the interviewer why the team chose to use Spring over EJB3 in their product. The interviewer pretty much tore my face off, yelling "Because Spring is not the be all and end all of Java software development, do you want this job or not?". In response to this, I told him that this probably wasn't the job for me, and I walked out of the interview. He had told me at the start of the interview that they had high staff turnover and that the product had gone from Modula-3 to Perl to Java; then, after I asked him a technical question, he went up in flames. It seemed obvious to me that he was toxic to the company with that kind of attitude. Question: Is it a good idea to probe the architectural choices taken in an interview? If not, why? From my own point of view, an interview is a two-way process. If the interviewers are testing me on my technical skills, I've got every right to ask them the same kind of questions to 1) figure out what their mindset and attitudes towards developing software solutions are, and 2) figure out whether they are in line with how I would approach problems of that kind. It's very possible that the interviewer who got angry was simply a bad interviewer and forgot that an interview is a two-way process. If I had been asked this, I would have simply said something along the lines of wanting to leverage the container more, but I certainly wouldn't have tried to put him in a state of meek capitulation. The interviewer in question was the lead developer on the team.

    Read the article

  • I'm tasked with leading the documentation effort for an existing, entirely undocumented, software product - what resources are there to help me?

    - by Ben Rose
    I'm a software developer at a technology company. I have been tasked with leading the documentation effort for the product I work on. The goal is to produce documentation internal to the developers, and the project spills over into the business side, where it covers requirements documentation. This project is challenging. Specifically, I'm dealing with a product which:

    - has been around for a long time, at least 6 years.
    - has no form of documentation other than some small, outdated pieces here and there.
    - has comments in the code, but they are technical and do not convey any over-arching behavior (even on the technical side).
    - as a consequence of having little to no documentation, is often unnecessarily complex under the covers.

    In addition, we have not been given a lot of time to work on this project. I do not have any formal documentation or writing background, training, or experience. I have displayed some ability in writing/communication around the office, which may be why I was assigned to this project. Please share your advice or recommended resources to help me prepare for and deal with this project. I'm looking for references to books/websites/forums/whatever, to help me come up with the design of a plan with milestones and learn about best practices, task delegation, templates, buy-in, etc. I'm hoping specifically for resources targeting, or giving special mention to, introducing good documentation to existing, undocumented projects. I would be very grateful for your responses. Ben

    Read the article

  • Pagination for product listing, what to use? "canonical" or "rel-prev-next" or do nothing?

    - by Jayapal Chandran
    I want to make sure about my product listing, which shows 10 products per page that are not in a series (link). They have explained how to use canonical or rel="prev" for pagination when a long page has been divided into multiple pages and those pages become a series, whereas my situation is not that. These are unique listings which are not related to each other... All the listing links lead to a product profile page. So let's say my site is all about cars and I have a Used Audi page with 1000 Audis for sale. There are 10 used Audi cars on each page, so there are 100 pages in the series. If I start to utilise rel="prev" and rel="next", should I set page 2 onwards as index,follow or noindex,follow? The content on pages 2 through 100 only changes ever so slightly, as different cars will be for sale on different pages, but from a "Panda" point of view the pages are incredibly similar: they'd hold the same metadata as page 1 in the series along with duplicate reviews & news etc. I want page 1 in the series to be the main page for Google to send users to, and I don't see the point in Google indexing pages 2 to 100. What's everyone's view on this? Lastly, with the rel="canonical" tag, should pages 2 to 100 all point back to page 1 in the series, or to the individual page itself? E.g. /used-audi/page-3/.
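
    For illustration only, this is what the three tag options being discussed would look like in the head of a middle page such as /used-audi/page-3/; the domain is a placeholder, and whether to combine them exactly this way (self-referencing canonical plus noindex,follow) is precisely the open question:

        <!-- hypothetical head of http://www.example.com/used-audi/page-3/ -->
        <link rel="canonical" href="http://www.example.com/used-audi/page-3/" />
        <link rel="prev" href="http://www.example.com/used-audi/page-2/" />
        <link rel="next" href="http://www.example.com/used-audi/page-4/" />
        <meta name="robots" content="noindex,follow" />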

    Read the article

  • Using C++ but not using the language's specific features, should I switch to C?

    - by Petruza
    I'm developing a NES emulator as a hobby, in my free time. I use C++ because it is the language I use most, know best and like most. But now that I have made some progress on the project, I realize I'm hardly using any C++-specific features, and I could have done it in plain C with the same result. I don't use templates, operator overloading, polymorphism or inheritance. So what would you say: should I stay with C++ or rewrite it in C? I wouldn't do this to gain performance; that could come as a side effect. The idea is: why should I use C++ if I don't need it? The only features of C++ I'm using are classes to encapsulate data and methods, but that can be done just as well with structs and functions; I'm using new and delete, but could just as well use malloc and free; and I'm using inheritance only for callbacks, which could be achieved with pointers to functions. Remember, it's a hobby project and I have no deadlines, so the extra time and work a rewrite would require are not a problem, and it might even be fun. So, the question is: C or C++?
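
    As a purely illustrative sketch of the last point (none of these names come from the emulator project), replacing an inheritance-based callback with a plain C function pointer could look like this:

        #include <stdio.h>

        /* a struct of function pointers stands in for a virtual interface */
        typedef struct {
            void (*on_frame)(void *ctx);
            void *ctx;
        } FrameHook;

        static void log_frame(void *ctx) {
            int *count = ctx;
            printf("frame %d\n", (*count)++);
        }

        static void run_frames(FrameHook hook, int frames) {
            for (int i = 0; i < frames; i++)
                hook.on_frame(hook.ctx);
        }

        int main(void) {
            int count = 0;
            FrameHook hook = { log_frame, &count };
            run_frames(hook, 3);
            return 0;
        }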

    Read the article

  • Problem: adding a feature in MOSS 2007

    - by Anoop
    Hi All, I have added a menu item in 'Actions' menu of a document library as follows(Using features: explained in MSDN How to: Add Actions to the User Interface: http://msdn.microsoft.com/en-us/library/ms473643.aspx): feature.xml as follows: <?xml version="1.0" encoding="utf-8" ?> <Feature Id="51DEF381-F853-4991-B8DC-EBFB485BACDA" Title="Import From REP" Description="This example shows how you can customize various areas inside Windows SharePoint Services." Version="1.0.0.0" Scope="Site" xmlns="http://schemas.microsoft.com/sharepoint/"> <ElementManifests> <ElementManifest Location="ImportFromREP.xml" /> </ElementManifests> </Feature> ImportFromREP.xml is as follows: <?xml version="1.0" encoding="utf-8"?> <Elements xmlns="http://schemas.microsoft.com/sharepoint/"> <!-- Document Library Toolbar Actions Menu Dropdown --> <CustomAction Id="ImportFromREP" RegistrationType="List" RegistrationId = "101" GroupId="ActionsMenu" Rights="AddListItems" Location="Microsoft.SharePoint.StandardMenu" Sequence="1000" Title="Import From REP"> <UrlAction Url="/_layouts/ImportFromREP.aspx?ActionsMenu"/> </CustomAction> </Elements> I have successfully installed and activated the feature. Normal case: Now if i login with the user having atleast 'Contribute' permission on the document library, i am able to see the menu item 'Import From REP' (in 'Actions' menu)in root folder as well as in all subfolders. Problem case: If user is having view rights on the document library but add/delete rights on a perticular subfolder(say 'subfolder') inside the document library: Menu item 'Import From REP' is not visible on root folder level as well as 'Upload' menu is also not visible. which is o.k. because user does not have AddListItems right on root folder but it is also not visible in 'subfolder' while 'Upload' menu is visible as user has add/delete rights on the 'subfolder'. did i mention a wrong value of attribute Rights(Rights="AddListItems") in the xml file?? If so, what should be the correct value? What should i do to simulate the behaviour of 'Upload' menu??(i.e. when 'Upload' menu is visible, 'Import From REP' menu is visible and when 'Upload' menu is not visible 'Import From REP' menu is also not visible. ) Thanks in advance Anoop

    Read the article

  • Oracle Fusion Middleware 11g next launch phase - what a week of product releases! Feedback from our

    - by Jürgen Kress
      Product releases: SOA Suite 11gR1 Patch Set 2 (PS2) BPM Suite 11gR1 Released Oracle JDeveloper 11g (11.1.1.3.0) (Build 5660) Oracle WebLogic Server 11gR1 (10.3.3) Oracle JRockit (4.0) Oracle Tuxedo 11gR1 (11.1.1.1.0) Enterprise Manager 11g Grid Control Release 1 (11.1.0.1.0) for Linux x86/x86-64 All Oracle Fusion Middleware 11gR1 Software Download   BPM Suite 11gR1 Released by Manoj Das Oracle BPM Suite 11gR1 became available for download from OTN and eDelivery. If you have been following our plans in this area, you know that this is the release unifying BEA ALBPM product, which became Oracle BPM10gR3, with the Oracle stack. Some of the highlights of this release are: BPMN 2.0 modeling and simulation Web based Process Composer for BPMN and Rules authoring Zero-code environment with full access to Oracle SOA Suite’s rich set of application and other adapters Process Spaces – Out-of-box integration with Web Center Suite Process Analytics – Native process cubes as well as integration with Oracle BAM You can learn more about this release from the documentation. Notes about downloading and installing Please note that Oracle BPM Suite 11gR1 is delivered and installed as part of SOA 11.1.1.3.0, which is a sparse release (only incremental patch). To install: Download and install SOA 11.1.1.2.0, which is a full release (you can find the bits at the above location) Download and install SOA 11.1.1.3.0 During configure step (using the Fusion Middleware configuration wizard), use the Oracle Business Process Management template supplied with the SOA Suite11g (11.1.1.3.0) If you plan to use Process Spaces, also install Web Center 11.1.1.3.0, which also is delivered as a sparse release and needs to be installed on top of Web Center 11.1.1.2.0   SOA Suite 11gR1 Patch Set 2 (PS2) released by Demed L'Her We just released SOA Suite 11gR1 Patch Set 2 (PS2)! You can download it as usual from: OTN (main platforms only) eDelivery (all platforms) 11gR1 PS2 is delivered as a sparse installer, that is to say that it is meant to be applied on the latest full install (11gR1 PS1). That’s great for existing PS1 users who simply need to apply the patch and run the patch assistant – but an extra step for new users who will first need to download SOA Suite 11gR1 PS1 (in addition to the PS2 patch). What’s in that release? Bug fixes of course but also several significant new features. Here is a short selection of the most significant ones: Spring component (for native Java extensibility and integration) SOA Partitions (to organize and manage your composites) Direct Binding (for transactional invocations to and from Oracle Service Bus) HTTP binding (for those of you trying to do away with SOAP and looking for simple GET and POST) Resequencer (for ordering out-of-order messages) WS Atomic Transactions (WS-AT) support (for propagation of transactions across heterogeneous environments) Check out the complete list of new features in PS2 for more (including links to the documentation for the above)! But maybe even more importantly we are also releasing Oracle Service Bus 11gR1 and BPM Suite 11gR1 at the same time – all on the same base platform (WebLogic Server 10.3.3)! (NB: it might take a while for all pages and caches to be updated with the new content so if you don’t find what you need today, try again soon!)   Are you Systems Integrations and Independent Software Vendors ready to adopt and to deliver? 
Make sure that you become trained: Local training calendars Register for the SOA Partner Community & Webcast www.oracle.com/goto/emea/soa What is your feedback?  Who installed the software? please feel free to share your experience at http://twitter.com/soacommunity #soacommunity Technorati Tags: SOA partner community ACE Directoris SOA Suite PS2 BPM11g First feedback from our ACE Directors and key Partners:   Now, these are great times to start the journey into BPM! Hajo Normann Reuse of components across the Oracle 11G Fusion Middleware stack, BPM just is one of the components plugging into the stack and reuses all other components. Mr. Leon Smiers With BPM11g, Oracle offers a very competitive product which will have a big effect on the IT market. Guido Schmutz We have real BPMN 2.0, which get's executed. No more transformation from business models to executable models - just press the run button... Torsten Winterberg Oracle BPM Suite 11g brings Out-of-box integration with WebCenter Suite and Oracle ADF development framework. Andrejus Baranovskis With the release of BPM Suite 11g, Oracle has defined new standards for Business Process platforms. Geoffroy de Lamalle With User Messaging Service you can let Soa Suite 11g do all your Messaging Edwin Biemond

    Read the article

  • How can I save BioPerl sequence nested features in genbank or embl format?

    - by Ryan Thompson
    In BioPerl, a sequence object can have any number of features, and each of these can have subfeatures nested within them. For example, a feature may be a complete coding sequence of a gene, and its subfeatures might be individual exons that are concatenated to form the full coding sequence. However, when I use BioPerl to write a sequence object to a file in genbank or embl format, only the top-level features are written to the file, not the sub-features nested within the top-level features. How can I store my subfeatures in sequence files? Should I just convert all my subfeatures into top-level features, and then reconstruct the tree structure next time I read in the sequence?

    Read the article

  • How to maintain multiple components for multiple clients with multiple features?

    - by Dhana
    Basically, my project is product-based. We developed the product once, acquired multiple clients, and deploy the application based on their needs. We decided to package new features and project-dependent modules as components. Now my application has a large number of customers, and every customer needs different features based on these components. But we have a centralized component base for all clients: we move a component's additional features into a client-specific folder and deploy. My problem is that I am unable to maintain the component features for multiple clients. The component feature code keeps growing and I am unable to track which features belong to which client. Is there any solution for maintaining multiple component features for multiple clients?

    Read the article

  • Why doesn't C# implement indexed properties?

    - by Thomas Levesque
    I know, I know... Eric Lippert's answer to this kind of question is usually something like "because it wasn't worth the cost of designing, implementing, testing and documenting it". But still, I'd like a better explanation... I was reading this blog post about new C# 4 features, and in the section about COM Interop, the following part caught my attention : By the way, this code uses one more new feature: indexed properties (take a closer look at those square brackets after Range.) But this feature is available only for COM interop; you cannot create your own indexed properties in C# 4.0. OK, but why ? I already knew and regretted that it wasn't possible to create indexed properties in C#, but this sentence made me think again about it. I can see several good reasons to implement it : the CLR supports it (for instance, PropertyInfo.GetValue has an index parameter), so it's a pity we can't take advantage of it in C# it is supported for COM interop, as shown in the article (using dynamic dispatch) it is implemented in VB.NET it is already possible to create indexers, i.e. to apply an index to the object itself, so it would probably be no big deal to extend the idea to properties, keeping the same syntax and just replacing this with a property name It would allow to write that kind of things : public class Foo { private string[] _values = new string[3]; public string Values[int index] { get { return _values[index]; } set { _values[index] = value; } } } Currently the only workaround that I know is to create an inner class (ValuesCollection for instance) that implements an indexer, and change the Values property so that it returns an instance of that inner class. This is very easy to do, but annoying... So perhaps the compiler could do it for us ! An option would be to generate an inner class that implements the indexer, and expose it through a public generic interface : // interface defined in the namespace System public interface IIndexer<TIndex, TValue> { TValue this[TIndex index] { get; set; } } public class Foo { private string[] _values = new string[3]; private class <>c__DisplayClass1 : IIndexer<int, string> { private Foo _foo; public <>c__DisplayClass1(Foo foo) { _foo = foo; } public string this[int index] { get { return _foo._values[index]; } set { _foo._values[index] = value; } } } private IIndexer<int, string> <>f__valuesIndexer; public IIndexer<int, string> Values { get { if (<>f__valuesIndexer == null) <>f__valuesIndexer = new <>c__DisplayClass1(this); return <>f__valuesIndexer; } } } But of course, in that case the property would actually return a IIndexer<int, string>, and wouldn't really be an indexed property... It would be better to generate a real CLR indexed property. What do you think ? Would you like to see this feature in C# ? If not, why ?

    Read the article

  • XSLT Exclude Specific Tags

    - by MMKD
    I have a problem i am trying to resolve in an XSLT but i an unable to find a solution. The example below is related to a payment system it is adding items to a basket then removing them. The out XML provides a audit trail of actions conducted on a basket. Senario: Add Item (Id 1) Add Item (Id 1) With a price change Void Item (Id 1) Void Item (Id 1) With a price change Add Item (Id 1) Add Item (Id 1) Expected Outcome Remove: Add Item (Id 1) Add Item (Id 1) With a price change Output XML Contains Void Item (Id 1) Void Item (Id 1) With a price change Add Item (Id 1) Add Item (Id 1) Input XML: <xml> <product void="false"> <sequence_number>1</sequence_number> <item_id>11111111</item_id> <price>12</price> </product> <product void="false"> <sequence_number>2</sequence_number> <item_id>11111111</item_id> <price>12</price> <price_change> <price>10</price> </price_change> </product> <product void="true"> <sequence_number>3</sequence_number> <item_id>11111111</item_id> <price>12</price> <price_change> <price>10</price> </price_change> </product> <product void="true"> <sequence_number>4</sequence_number> <item_id>11111111</item_id> <price>12</price> </product> <product void="false"> <sequence_number>5</sequence_number> <item_id>11111111</item_id> <price>12</price> </product> <product void="false"> <sequence_number>6</sequence_number> <item_id>11111111</item_id> <price>12</price> </product> </xml> Expected outcome: <xml> <product void="true"> <sequence_number>3</sequence_number> <item_id>11111111</item_id> <price>12</price> <price_change> <price>10</price> </price_change> </product> <product void="true"> <sequence_number>4</sequence_number> <item_id>11111111</item_id> <price>12</price> </product> <product void="false"> <sequence_number>5</sequence_number> <item_id>11111111</item_id> <price>12</price> </product> <product void="false"> <sequence_number>6</sequence_number> <item_id>11111111</item_id> <price>12</price> </product> </xml> XSLT <xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform"> <xsl:template match="@*|node()"> <xsl:copy> <xsl:apply-templates select="@*|node()"/> </xsl:copy> </xsl:template> <xsl:template match="//product[@void='false']"> <xsl:if test="item_id != //product[@void='true']/item_id"> <xsl:copy> <xsl:apply-templates select="@*|node()"/> </xsl:copy> </xsl:if> </xsl:template> </xsl:stylesheet> The problem with this is that it is deleting all products that are not voided and have the same id and not taking into account the number of void items vs the number of none void items. If you have 1 void item it should only delete one product that is not voided but has exactly the same tags as itself

    Read the article

  • Need a non-Mac keyboard alternative for Mac users

    - by bLee
    I'm a Macbook Pro user. I like Apple products in general, but I hate their keyboards. I work on my computer/laptop 10+ hours a day, so I would like to have a keyboard that is Mac-compatible and ergonomic. I found this one on Amazon. It is specifically made for Mac. However, I would like to get some suggestions from people who found good keyboards for Mac.

    Read the article

  • What is the Oracle Utilities Application Framework?

    - by Anthony Shorten
    The Oracle Utilities Application Framework is a reusable, scalable and flexible Java-based framework which allows other products to be built, configured and implemented in a standard way. Note: Even though the Framework is built in Java, it can be integrated with COBOL-based extensions for backward compatibility. When Oracle Utilities Customer Care & Billing was migrated from V1 to V2, it was decided that the technical aspects of that product be separated out to allow for reuse and independence from technical issues. The idea was that all the technical aspects would be concentrated in this separate product (i.e. a framework), allowing all products using the framework to concentrate on delivering superior functionality. The product was named the Oracle Utilities Application Framework (oufw is the product code). The technical components contained in the Oracle Utilities Application Framework can be summarized as follows:
    - Metadata - The Oracle Utilities Application Framework is responsible for defining and using the metadata that defines the runtime behavior of the product. All metadata definition and management is contained within the Oracle Utilities Application Framework.
    - UI Management - The Oracle Utilities Application Framework is responsible for defining and rendering the pages and for ensuring the pages are in the appropriate format for the locale.
    - Integration - The Oracle Utilities Application Framework is responsible for providing the integration points to the architecture. Refer to the Oracle Utilities Application Framework Integration Overview for more details.
    - Tools - The Oracle Utilities Application Framework provides a common set of facilities and tools that can be used across all products.
    - Technology - The Oracle Utilities Application Framework is responsible for all technology standards compliance, platform support and integration.
    There are a number of products from the Tax and Utilities Global Business Unit as well as from the Financial Services Global Business Unit that are built upon the Oracle Utilities Application Framework. These products require the Oracle Utilities Application Framework to be installed first, and then the product itself is installed onto the framework to complete the installation process. There are a number of key benefits that the Oracle Utilities Application Framework provides to these products:
    - Common facilities - The Oracle Utilities Application Framework provides a standard set of technical facilities so that products can concentrate on the unique aspects of their markets rather than making technical decisions.
    - Common methods of configuration - The Oracle Utilities Application Framework standardizes the technical configuration process for a product. Customers can effectively reuse the configuration process across products.
    - Multi-lingual and multi-platform - The Oracle Utilities Application Framework allows the products to be offered in more markets and across multiple platforms for maximized flexibility.
    - Common methods of implementation - The Oracle Utilities Application Framework standardizes the technical aspects of a product implementation. Customers can effectively reuse the technical implementation process across products.
    - Quicker adoption of new technologies - As new technologies and standards are identified as being important for the product line, they can be integrated centrally, benefiting multiple products.
    - Cross-product reuse - As enhancements to the Oracle Utilities Application Framework are identified by a particular product, all products can potentially benefit from the enhancement.
    Note: Use of the Oracle Utilities Application Framework does not preclude the introduction of product-specific technologies or facilities to satisfy market needs. The framework minimizes the need for them and assists in the quick integration of a new product-specific piece of technology (if necessary). The Framework is not available as a product in its own right and is bundled with Tax and Utilities Global Business Unit products. At the present time the following products are on the Framework: Oracle Utilities Customer Care and Billing (V2 and above), Oracle Enterprise Taxation Management (V2 and above), Oracle Utilities Business Intelligence (V2 and above) and Oracle Utilities Mobile Workforce Management (V2 and above).

    Read the article

  • Unlimited and multi-computer online storage solutions with automatic backup

    - by JRL
    As the title says, what are the existing online storage solutions that provide: unlimited storage automatic backup and allow for an unlimited number of computers (use not tied to a single computer)? There are several existing questions on this site related to online storage solutions, but none that is specifically targeted to what I want, so I thought I'd ask the question. This wikipedia article lists some of them, are there others? How do they compare in terms of price, feature set and ease of use? Update: Kinda disappointed no one has any answers to this so far. JungleDisk looks promising, anyone have experience with it? Update 2: To answer the comments, what I'm looking for definitely DOES exist. These solutions all seem to fit the bill: BackMii CrashPlan DataPreserve Humyo JungleDisk KeepVault SpiderOak And some of them are quite cheap (CrashPlan is $100 a year). For unlimited space and computers, I'd say that's pretty good. Does anyone have experience with CrashPlan or any other of the above solutions?

    Read the article

  • Understanding LINQ to SQL (11) Performance

    - by Dixin
    [LINQ via C# series] LINQ to SQL has a lot of great features like strong typing, query compilation, deferred execution, declarative paradigm, etc., which are very productive. Of course, these do not come for free, and one price is performance.
    O/R mapping overhead
    Because LINQ to SQL is based on O/R mapping, one obvious overhead is that changing data usually requires retrieving data first: private static void UpdateProductUnitPrice(int id, decimal unitPrice) { using (NorthwindDataContext database = new NorthwindDataContext()) { Product product = database.Products.Single(item => item.ProductID == id); // SELECT... product.UnitPrice = unitPrice; // UPDATE... database.SubmitChanges(); } } Before updating an entity, that entity has to be retrieved by an extra SELECT query. This is slower than a direct data update via ADO.NET: private static void UpdateProductUnitPrice(int id, decimal unitPrice) { using (SqlConnection connection = new SqlConnection( "Data Source=localhost;Initial Catalog=Northwind;Integrated Security=True")) using (SqlCommand command = new SqlCommand( @"UPDATE [dbo].[Products] SET [UnitPrice] = @UnitPrice WHERE [ProductID] = @ProductID", connection)) { command.Parameters.Add("@ProductID", SqlDbType.Int).Value = id; command.Parameters.Add("@UnitPrice", SqlDbType.Money).Value = unitPrice; connection.Open(); command.Transaction = connection.BeginTransaction(); command.ExecuteNonQuery(); // UPDATE... command.Transaction.Commit(); } } The above imperative code specifies the “how to do it” details and has better performance. For the same reason, some articles on the Internet insist that, when updating data via LINQ to SQL, the above declarative code should be replaced by: private static void UpdateProductUnitPrice(int id, decimal unitPrice) { using (NorthwindDataContext database = new NorthwindDataContext()) { database.ExecuteCommand( "UPDATE [dbo].[Products] SET [UnitPrice] = {0} WHERE [ProductID] = {1}", unitPrice, id); } } Or just create a stored procedure: CREATE PROCEDURE [dbo].[UpdateProductUnitPrice] ( @ProductID INT, @UnitPrice MONEY ) AS BEGIN BEGIN TRANSACTION UPDATE [dbo].[Products] SET [UnitPrice] = @UnitPrice WHERE [ProductID] = @ProductID COMMIT TRANSACTION END and map it as a method of NorthwindDataContext (explained in this post): private static void UpdateProductUnitPrice(int id, decimal unitPrice) { using (NorthwindDataContext database = new NorthwindDataContext()) { database.UpdateProductUnitPrice(id, unitPrice); } } As a normal trade-off for O/R mapping, a decision has to be made between performance overhead and programming productivity according to the case. From a developer’s perspective, if O/R mapping is chosen, I consistently choose the declarative LINQ code, unless this kind of overhead is unacceptable.
    Data retrieving overhead
    After the O/R mapping specific issue, now look into the LINQ to SQL specific issues, for example, performance of the data retrieving process. The previous post has explained that the SQL translating and executing is complex. Actually, the LINQ to SQL pipeline is similar to a compiler pipeline.
    It consists of about 15 steps to translate a C# expression tree to a SQL statement, which can be categorized as: Convert: Invoke SqlProvider.BuildQuery() to convert the tree of Expression nodes into a tree of SqlNode nodes; Bind: Use the visitor pattern to figure out the meanings of names according to the mapping info, like a property for a column, etc.; Flatten: Figure out the hierarchy of the query; Rewrite: Rewrite for SQL Server 2000, if needed; Reduce: Remove the unnecessary information from the tree; Format: Generate the SQL statement string; Parameterize: Figure out the parameters, for example, a reference to a local variable should be a parameter in SQL; Materialize: Execute the reader and convert the result back into typed objects. So for each data retrieval, even for a query which looks simple: private static Product[] RetrieveProducts(int productId) { using (NorthwindDataContext database = new NorthwindDataContext()) { return database.Products.Where(product => product.ProductID == productId) .ToArray(); } } LINQ to SQL goes through the above steps to translate and execute the query. Fortunately, there is a built-in way to cache the translated query.
    Compiled query
    When such a LINQ to SQL query is executed repeatedly, CompiledQuery can be used to translate the query once and execute it multiple times: internal static class CompiledQueries { private static readonly Func<NorthwindDataContext, int, Product[]> _retrieveProducts = CompiledQuery.Compile((NorthwindDataContext database, int productId) => database.Products.Where(product => product.ProductID == productId).ToArray()); internal static Product[] RetrieveProducts( this NorthwindDataContext database, int productId) { return _retrieveProducts(database, productId); } } The new version of RetrieveProducts() gets better performance, because SqlProvider.Compile() is invoked internally to translate the query expression only the first time _retrieveProducts is invoked. It also uses a lock to make sure the query is translated only once in multi-threading scenarios.
    Static SQL / stored procedures without translating
    Another way to avoid the translating overhead is to use static SQL or stored procedures, just like the above examples. Because this is a functional programming series, this article does not dive into them. For the details, Scott Guthrie already has some excellent articles: LINQ to SQL (Part 6: Retrieving Data Using Stored Procedures), LINQ to SQL (Part 7: Updating our Database using Stored Procedures) and LINQ to SQL (Part 8: Executing Custom SQL Expressions).
    Data changing overhead
    Looking into the data updating process, it also needs a lot of work:
    - Begins the transaction
    - Processes the changes (ChangeProcessor):
      - Walks through the objects to identify the changes
      - Determines the order of the changes
      - Executes the changes; LINQ queries may be needed to execute the changes, like the first example in this article, where an object needs to be retrieved before being changed, so the whole data retrieving process above is gone through again
      - If there is user customization, it is executed; for example, a table's INSERT / UPDATE / DELETE behavior can be customized in the O/R designer
    It is important to keep these overheads in mind.
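    As a side note, one possible way to avoid the extra SELECT before an UPDATE while staying in the object model is to attach a detached entity and only set the changed member. The following is a minimal sketch, not taken from the examples above; it assumes the Product mapping allows it, for example the other members use UpdateCheck.Never or the original version (timestamp) value is supplied, otherwise SubmitChanges() fails the optimistic concurrency check:
    // Minimal sketch, assuming the Product mapping uses UpdateCheck.Never on the
    // members not set here (or a version column whose original value is supplied);
    // otherwise SubmitChanges() throws a ChangeConflictException because the
    // concurrency check cannot match the original column values.
    private static void UpdateProductUnitPriceWithoutSelect(int id, decimal unitPrice)
    {
        using (NorthwindDataContext database = new NorthwindDataContext())
        {
            Product product = new Product { ProductID = id };
            database.Products.Attach(product); // Attach as unchanged; no SELECT is issued.
            product.UnitPrice = unitPrice;     // Now tracked as a modification.
            database.SubmitChanges();          // Single UPDATE ... WHERE [ProductID] = @p0 ...
        }
    }
    This keeps the single UPDATE round trip of the ADO.NET version, at the cost of the optimistic concurrency protection that the retrieve-then-update version provides.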
    Bulk deleting / updating
    Another thing to be aware of is bulk deleting: private static void DeleteProducts(int categoryId) { using (NorthwindDataContext database = new NorthwindDataContext()) { database.Products.DeleteAllOnSubmit( database.Products.Where(product => product.CategoryID == categoryId)); database.SubmitChanges(); } } The expected SQL should be something like: BEGIN TRANSACTION exec sp_executesql N'DELETE FROM [dbo].[Products] AS [t0] WHERE [t0].[CategoryID] = @p0',N'@p0 int',@p0=9 COMMIT TRANSACTION However, as mentioned before, the actual SQL first retrieves the entities and then deletes them one by one: -- Retrieves the entities to be deleted: exec sp_executesql N'SELECT [t0].[ProductID], [t0].[ProductName], [t0].[SupplierID], [t0].[CategoryID], [t0].[QuantityPerUnit], [t0].[UnitPrice], [t0].[UnitsInStock], [t0].[UnitsOnOrder], [t0].[ReorderLevel], [t0].[Discontinued] FROM [dbo].[Products] AS [t0] WHERE [t0].[CategoryID] = @p0',N'@p0 int',@p0=9 -- Deletes the retrieved entities one by one: BEGIN TRANSACTION exec sp_executesql N'DELETE FROM [dbo].[Products] WHERE ([ProductID] = @p0) AND ([ProductName] = @p1) AND ([SupplierID] IS NULL) AND ([CategoryID] = @p2) AND ([QuantityPerUnit] IS NULL) AND ([UnitPrice] = @p3) AND ([UnitsInStock] = @p4) AND ([UnitsOnOrder] = @p5) AND ([ReorderLevel] = @p6) AND (NOT ([Discontinued] = 1))',N'@p0 int,@p1 nvarchar(4000),@p2 int,@p3 money,@p4 smallint,@p5 smallint,@p6 smallint',@p0=78,@p1=N'Optimus Prime',@p2=9,@p3=$0.0000,@p4=0,@p5=0,@p6=0 exec sp_executesql N'DELETE FROM [dbo].[Products] WHERE ([ProductID] = @p0) AND ([ProductName] = @p1) AND ([SupplierID] IS NULL) AND ([CategoryID] = @p2) AND ([QuantityPerUnit] IS NULL) AND ([UnitPrice] = @p3) AND ([UnitsInStock] = @p4) AND ([UnitsOnOrder] = @p5) AND ([ReorderLevel] = @p6) AND (NOT ([Discontinued] = 1))',N'@p0 int,@p1 nvarchar(4000),@p2 int,@p3 money,@p4 smallint,@p5 smallint,@p6 smallint',@p0=79,@p1=N'Bumble Bee',@p2=9,@p3=$0.0000,@p4=0,@p5=0,@p6=0 -- ... COMMIT TRANSACTION The same goes for bulk updating. This is really not efficient and needs to be kept in mind. There are already some solutions on the Internet, like this one. The idea is to wrap the above SELECT statement into an INNER JOIN: exec sp_executesql N'DELETE [dbo].[Products] FROM [dbo].[Products] AS [j0] INNER JOIN ( SELECT [t0].[ProductID], [t0].[ProductName], [t0].[SupplierID], [t0].[CategoryID], [t0].[QuantityPerUnit], [t0].[UnitPrice], [t0].[UnitsInStock], [t0].[UnitsOnOrder], [t0].[ReorderLevel], [t0].[Discontinued] FROM [dbo].[Products] AS [t0] WHERE [t0].[CategoryID] = @p0) AS [j1] ON ([j0].[ProductID] = [j1].[ProductID])', -- The Primary Key N'@p0 int',@p0=9
    Query plan overhead
    The last thing is about the SQL Server query plan. Before .NET 4.0, LINQ to SQL had an issue (not sure if it is a bug). LINQ to SQL internally uses ADO.NET, but it does not set the SqlParameter.Size for a variable-length argument, like arguments of NVARCHAR type, etc. So for two queries with the same SQL but different argument lengths: using (NorthwindDataContext database = new NorthwindDataContext()) { database.Products.Where(product => product.ProductName == "A") .Select(product => product.ProductID).ToArray(); // The same SQL and argument type, different argument length.
    database.Products.Where(product => product.ProductName == "AA") .Select(product => product.ProductID).ToArray(); } Pay attention to the argument length in the translated SQL: exec sp_executesql N'SELECT [t0].[ProductID] FROM [dbo].[Products] AS [t0] WHERE [t0].[ProductName] = @p0',N'@p0 nvarchar(1)',@p0=N'A' exec sp_executesql N'SELECT [t0].[ProductID] FROM [dbo].[Products] AS [t0] WHERE [t0].[ProductName] = @p0',N'@p0 nvarchar(2)',@p0=N'AA' Here is the overhead: the first query's cached query plan is not reused by the second one: SELECT sys.syscacheobjects.cacheobjtype, sys.dm_exec_cached_plans.usecounts, sys.syscacheobjects.[sql] FROM sys.syscacheobjects INNER JOIN sys.dm_exec_cached_plans ON sys.syscacheobjects.bucketid = sys.dm_exec_cached_plans.bucketid; They actually use different query plans. Again, pay attention to the argument length in the [sql] column (@p0 nvarchar(2) / @p0 nvarchar(1)). Fortunately, in .NET 4.0 this is fixed: internal static class SqlTypeSystem { private abstract class ProviderBase : TypeSystemProvider { protected int? GetLargestDeclarableSize(SqlType declaredType) { SqlDbType sqlDbType = declaredType.SqlDbType; if (sqlDbType <= SqlDbType.Image) { switch (sqlDbType) { case SqlDbType.Binary: case SqlDbType.Image: return 8000; } return null; } if (sqlDbType == SqlDbType.NVarChar) { return 4000; // Max length for NVARCHAR. } if (sqlDbType != SqlDbType.VarChar) { return null; } return 8000; } } } In the above example, the translated SQL becomes: exec sp_executesql N'SELECT [t0].[ProductID] FROM [dbo].[Products] AS [t0] WHERE [t0].[ProductName] = @p0',N'@p0 nvarchar(4000)',@p0=N'A' exec sp_executesql N'SELECT [t0].[ProductID] FROM [dbo].[Products] AS [t0] WHERE [t0].[ProductName] = @p0',N'@p0 nvarchar(4000)',@p0=N'AA' So they reuse the same query plan cache: now the [usecounts] column is 2.
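    For code that issues parameterized SQL directly (for example via ADO.NET), a similar plan reuse can be achieved by declaring a fixed parameter size instead of letting it default to the argument length. The following is a minimal sketch, assuming the same Northwind database and connection string as the earlier ADO.NET example; the RetrieveProductIdsByName method name is made up for illustration and requires the System.Collections.Generic, System.Data and System.Data.SqlClient namespaces:
    // With the declared size pinned at NVARCHAR(4000), "A" and "AA" produce the
    // same parameterized statement text, so SQL Server compiles one query plan
    // and reuses it (the [usecounts] column grows instead of a plan per length).
    private static int[] RetrieveProductIdsByName(string productName)
    {
        List<int> productIds = new List<int>();
        using (SqlConnection connection = new SqlConnection(
            "Data Source=localhost;Initial Catalog=Northwind;Integrated Security=True"))
        using (SqlCommand command = new SqlCommand(
            "SELECT [ProductID] FROM [dbo].[Products] WHERE [ProductName] = @ProductName",
            connection))
        {
            // Fixed declared size -> one cached plan for all argument lengths.
            command.Parameters.Add("@ProductName", SqlDbType.NVarChar, 4000).Value = productName;
            connection.Open();
            using (SqlDataReader reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    productIds.Add(reader.GetInt32(0)); // ProductID is INT in Northwind.
                }
            }
        }
        return productIds.ToArray();
    }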

    Read the article

  • i5 vs. i7 processor dev laptop

    - by vector
    Greetings! I need to get a laptop for dev work (mostly server-side Java, NetBeans) and wonder if anyone has had a chance to use either an i5- or i7-based laptop. Is the i7 overkill? ... or will the i5 handle it just fine? I'm thinking of something from the HP line running Ubuntu. Thanks

    Read the article

  • Macbook Pro 2.66 GHz vs. 2.8 GHz

    - by nevan
    Is there much advantage in getting the higher end Macbook Pro compared to the mid-range one? The differences between the two are: 2.66 GHz vs. 2.8 GHz 256 MB graphics memory vs. 512 MB 3 MB L2 cache vs. 6 MB 320 GB hard drive vs. 500 GB $2000 vs. $2300 I've looked around, but I can't find any direct comparisons for the two machines. I'd be using the machine for development. I generally use a computer for 3 years. I don't really play games, but do use Photoshop regularly. I've heard that once Snow Leopard arrives, the graphics chip will be used to boost the main processor, so I was wondering if getting the one with more graphics memory would be an advantage?

    Read the article

< Previous Page | 52 53 54 55 56 57 58 59 60 61 62 63  | Next Page >