Search Results

Search found 19352 results on 775 pages for 'product management'.


  • New AIA Product Information Center (PIC) pages

    - by Lionel Dubreuil
    Did You Know? The AIA Support team has spent the last few months reformatting our Product Master Notes into a brand-new Product Information Center (PIC) format! The PIC pages contain links to documentation, new and most popular documents, alerts, FAQs, recommended patches, and the certification matrix. Our main PIC page lists all the PIP and Foundation Pack PIC notes for easier navigation. Take a look and tell us what you think! You can email [email protected] with your feedback!

    Read the article

  • What is the best agile project management technique for developing innovative software systems?

    - by user654019
    I am involved with the development of innovative software. The development is innovative in the sense that we don't yet know how to build the system, nobody has built it before, and it is unclear which algorithms we should use to implement it. The process consists of several stages of studying books/papers, suggesting algorithms, writing prototypes and comparing the results with actual data. We hope that after some iterations we will converge on a valid software system. What is the best project management approach that we can use? Is there any project management software for these types of projects?

    Read the article

  • Announcing General Availability of the E-Business Suite Plug-in

    - by Kenneth E.
    Oracle E-Business Suite Application Technology Group (ATG) is pleased to announce the General Availability of Oracle E-Business Suite Plug-in 12.1.0.1.0, an integral part of Application Management Suite for Oracle E-Business Suite. The combination of Enterprise Manager 12c Cloud Control and the Application Management Suite brings together the functionality that was available in the standalone Application Management Pack for Oracle E-Business Suite and Application Change Management Pack for Oracle E-Business Suite with Oracle's Real User Experience Insight product and the Configuration & Compliance capabilities, providing the most complete solution for managing Oracle E-Business Suite applications. The features that were available in the standalone management packs are now packaged into the Oracle E-Business Suite Plug-in, which is fully certified with Oracle Enterprise Manager 12c Cloud Control. This latest plug-in extends Cloud Control with E-Business Suite-specific system management capabilities and enhanced change management support.
    Here is all the information you need to get started:
    • Full announcement: E-Business Suite Plug-in 12.1.0.1 for Enterprise Manager 12c Now Available
    • My Oracle Support: Getting Started with Oracle E-Business Suite Plug-in, Release 12.1.0.1.0 (Doc ID 1434392.1)
    • Documentation: Oracle Application Management Pack for Oracle E-Business Suite Guide, Release 12.1.0.1.0
    • Certification: platform and OS release certification information is available from My Oracle Support via the Certification page; search using the official trademark name Oracle Application Management Pack for Oracle E-Business Suite and Release 12.1.0.1.0.

    Read the article

  • How to decide on going into management?

    - by Rob Wells
    I read the transcript of a speech by Richard Hamming, included as part of this SO question, and the speech had a quote that got me thinking about when someone should move into management: "When your vision of what you want to do is what you can do single-handedly, then you should pursue it. The day your vision, what you think needs to be done, is bigger than what you can do single-handedly, then you have to move toward management. And the bigger the vision is, the farther in management you have to go." Any other suggestions as to how you can decide if you want to move away from the coal face and into management?

    Read the article

  • Five Key Strategies in Master Data Management

    - by david.butler(at)oracle.com
    Here is a very interesting Profit Magazine article on MDM: "A recent customer survey reveals the deleterious effects of data fragmentation," by Trevor Naidoo, December 2010.

    Across industries and geographies, IT organizations have grown in complexity, whether due to mergers and acquisitions or to decentralized systems supporting functional or departmental requirements. With systems architected over time to support unique, one-off process needs, they are becoming costly to maintain, and the Internet has only added to the complexity. Data fragmentation has become a key inhibitor to delivering flexible, user-friendly systems. The Oracle Insight team conducted a survey over the past two years assessing customers' master data management (MDM) capabilities to get a sense of where they stand. The responses, from 27 respondents across six different industries, reveal five key areas in which customers need to improve their data management in order to get better financial results.

    1. Less than 15 percent of organizations surveyed understand the sources and quality of their master data and have a roadmap to address missing data domains. Examples of master data domains are customer, supplier, product, financial and site. Many organizations have multiple sources of master data with varying degrees of data quality in each source; for example, customer data stored in the customer relationship management system may be inconsistent with customer data stored in the order management system. Imagine not knowing in how many places you store your customer information, or whether a customer's address is up to date in each source. In fact, more than 55 percent of the respondents in the survey manage their data quality on an ad hoc basis. It is important for organizations to document their inventory of data sources and then profile these data sources to ensure that there is a consistent definition of key data entities throughout the organization. Some questions to ask are: How do we define a customer? What is a product? How do we define a site? The goal is to strive for one common repository for master data that acts as a cross-reference for all other sources and ensures consistent, high-quality master data throughout the organization.

    2. Only 18 percent of respondents have an enterprise data management strategy to ensure that data is treated as an asset to the organization. Most respondents handle data at the department or functional level and do not have an enterprise view of their master data. The sales department may track all of its interactions with customers as they move through the sales cycle, while the service department tracks its interactions with the same customers independently, and the finance department has yet another perspective on the same customer. The salesperson may not be aware that the customer she is trying to sell to is experiencing issues with existing products purchased, or that the customer is behind on previous invoices. The lack of a data strategy makes it difficult for business users to turn data into information via reports. Without the key building blocks in place, it is difficult to create key linkages between customer, product, site, supplier and financial data. These linkages make it possible to understand patterns. A well-defined data management strategy is aligned to the business strategy and helps create the governance needed to ensure that data stewardship is in place and data integrity is intact.

    3. Almost 60 percent of respondents have no strategy to integrate data across operational applications. Many respondents have several disparate sources of data with no strategy to keep them in sync with each other. Even though there is no clear strategy to integrate the data (see point 2 above), the data needs to be synced and cross-referenced to keep the business processes running. About 55 percent of respondents said they perform this integration on an ad hoc basis, and in many cases it is done manually with the help of Microsoft Excel spreadsheets. For example, a salesperson needs a report on global sales for a specific product, but the product has different product numbers in different countries. Typically, an analyst will pull all the data into Excel, manually create a cross-reference for that product, and then aggregate the sales. The exact same procedure has to be followed if the same report is needed the following month. A well-defined consolidation strategy ensures that a central cross-reference is maintained, with updates in any one application propagated to all the other systems so that data is synchronized and up to date. This can be done in real time or in batch mode using integration technology.

    4. Approximately 50 percent of respondents spend manual effort cleansing and normalizing data. Information stored in various systems usually follows different standards and formats, making it difficult to match the data. A customer's address can be stored in different ways using a variety of abbreviations, such as "av" or "ave" for avenue. Similarly, a product's attributes can be stored in a number of different ways; a size attribute, for example, can be stored with the unit spelled out in inches or entered using inch marks. These types of variations make it difficult to match up data from different sources. Today, most customers rely on manual, heroic efforts to match, cleanse and de-duplicate data, which is clearly not a scalable, sustainable model. To solve this challenge, organizations need the ability to standardize data for customers, products, sites, suppliers and financial accounts; however, less than 10 percent of respondents have technology in place to automatically resolve duplicates. It is no wonder, therefore, that we receive communications about products we don't own, at addresses where we don't reside, through channels (like direct mail) we don't like. An all-too-common consequence is that customers end up receiving duplicate communications, which not only hurts customer satisfaction but also incurs additional mailing costs. Cleansing, normalizing and standardizing data will help address most of these issues.

    5. Only 10 percent of respondents have the ability to share data that was mastered in a master data hub. Close to 60 percent of respondents have efforts in place that profile, standardize and cleanse data manually, and the output of these efforts is stored in spreadsheets in various parts of the organization. This valuable information is not easily shared with the rest of the organization and, more importantly, this enriched information cannot be sent back to the source systems so that the data is fixed at the source. A key benefit of a master data management strategy is not only to clean the data, but also to share the data back to the source systems as well as to other systems that need the information. Aside from the source systems, another key beneficiary of this data is the business intelligence system; having clean master data as input to business intelligence systems provides more accurate and enhanced reporting.

    Characteristics of Stellar MDM

    When deciding on the right master data management technology, organizations should look for solutions that have four main characteristics:
    • enterprise-grade MDM performance
    • complete technology that can be rapidly deployed and addresses multiple business issues
    • end-to-end MDM process management with data quality monitoring and assurance
    • pre-built, business-relevant MDM applications with data stores and workflows

    These master data management capabilities will aid in moving closer to a best-practice maturity level, delivering tremendous efficiencies and savings as well as revenue growth opportunities that come from better understanding your customers.

    Trevor Naidoo is a senior director in Industry Strategy and Insight at Oracle.
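    The cross-reference pain in point 3 is easy to see in miniature. Below is a minimal sketch of what a central cross-reference does, using pandas and entirely hypothetical data: regional extracts carry local product numbers, and a master mapping resolves them before aggregation.

    ```python
    import pandas as pd

    # Hypothetical regional sales extracts; each system has its own product numbers.
    us_sales = pd.DataFrame({"local_sku": ["A-100", "A-200"], "units": [120, 75]})
    de_sales = pd.DataFrame({"local_sku": ["DE-7731", "DE-8810"], "units": [40, 15]})

    # Central cross-reference maintained by the hub: local SKU -> master product ID.
    xref = pd.DataFrame({
        "local_sku": ["A-100", "A-200", "DE-7731", "DE-8810"],
        "master_id": ["PROD-1", "PROD-2", "PROD-1", "PROD-2"],
    })

    # Resolve every extract to master IDs, then aggregate global sales per product.
    combined = pd.concat([us_sales, de_sales])
    global_sales = combined.merge(xref, on="local_sku").groupby("master_id")["units"].sum()
    print(global_sales)  # PROD-1: 160, PROD-2: 90
    ```

    Maintained by hand in Excel, this mapping rots; maintained centrally and propagated to the source systems, the same report is a one-liner every month.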

    Read the article

  • Configuration management in support of scientific computing

    - by Sharpie
    For the past few years I have been involved with developing and maintaining a system for forecasting near-shore waves. Our team has just received a significant grant for further development, and as a result we are taking the opportunity to refactor many components of the old system. We will also be receiving a new server to run the model, so I am taking this opportunity to reconsider how we set up the system. Basically, the steps that need to happen are:
    • Some standard packages and libraries, such as compilers and databases, need to be downloaded and installed.
    • Some custom scientific models need to be downloaded and compiled from source, as they are not commonly provided as packages.
    • New users need to be created to manage the databases and run the models.
    • A suite of scripts that manage model-database interaction needs to be checked out from source control and installed.
    • Crontabs need to be set up to run the scripts at regular intervals in order to generate forecasts.
    I have been pondering applying tools such as Puppet, Capistrano or Fabric to automate the above steps (a rough sketch of what that might look like follows below). It seems perfectly possible to implement most of the above functionality, except there are a couple of use cases I am wondering about:
    • During my preliminary research, I have found few examples and little discussion on how to use these systems to abstract and automate the process of building custom components from source.
    • We may have to deploy on machines that are isolated from the Internet, i.e. all configuration and setup files will have to come in on a USB key that can be inserted into a terminal that can connect to the server that will run the models.
    I see this as an opportunity to learn a new tool that will help me automate my workflow, but I am unsure which tool I should start with. If any member of the community could suggest a tool that would support the above workflow and the issues specific to scientific computing, I would be very grateful. Our production server will be running Linux, but support for OS X would be a bonus, as it would allow the development team to set up test installations outside of VirtualBox.
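    Since Fabric came up, here is a rough sketch of what the five steps might look like as a fabfile. It is purely illustrative: host names, package names, paths and the USB-key source locations are all hypothetical, and a Puppet manifest would express the same steps declaratively rather than imperatively.

    ```python
    # fabfile.py -- illustrative sketch using Fabric's 1.x-style API.
    # All host names, packages, paths and repo locations below are hypothetical.
    from fabric.api import cd, env, run, sudo

    env.hosts = ["forecast-server.example.org"]

    def install_packages():
        """Step 1: standard packages and libraries (compilers, database)."""
        sudo("apt-get update && apt-get -y install build-essential gfortran postgresql")

    def build_wave_model():
        """Step 2: custom scientific model compiled from source.
        Sources come in on the USB key, per the offline constraint."""
        run("mkdir -p ~/build && tar xzf /media/usb/wave-model-src.tar.gz -C ~/build")
        with cd("~/build/wave-model"):
            run("./configure && make && make install")

    def create_users():
        """Step 3: service account that owns the database and model runs."""
        sudo("useradd -m -s /bin/bash modelrunner")

    def install_scripts_and_cron():
        """Steps 4 and 5: check out the glue scripts and schedule forecasts."""
        run("git clone /media/usb/model-scripts.git ~/scripts")
        run('(crontab -l; echo "0 */6 * * * ~/scripts/run_forecast.sh") | crontab -')
    ```

    Run with e.g. `fab install_packages build_wave_model`. The offline case is then mostly a matter of pointing the package and source locations at the mounted key.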

    Read the article

  • Tokyo Tyrant ulog / update log management.

    - by Nathan Milford
    I'm testing Tokyo Tyrant in a master-master setup and have found that the ulog grows out of control and locks up the disk. At first I found the -ulim option useful, as it limits the logfile size; however, it simply rolls over to a new log, leaving the old ones to clutter up the partition. I suppose I'll write a shell script that deletes ulogs older than X, once I find out how far back in the update log Tokyo Tyrant needs to look in order to fail over (a sketch of such a cleanup script follows below). Does anyone have any experience with this in Tokyo Tyrant? Do you have a feel (acknowledging that every install is different, based on what is being stored) for the optimal ulog size versus how far back an instance needs to look in the ulog to assume master status? Thanks, nathan
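    For what it's worth, the cleanup job is only a few lines; a minimal sketch in Python, where the ulog directory, the .ulog naming and the retention window are assumptions to be checked against how far back failover actually needs to read:

    ```python
    #!/usr/bin/env python
    # Delete Tokyo Tyrant ulog segments older than a retention window.
    # ULOG_DIR, the *.ulog naming and RETAIN_DAYS are assumptions; size the
    # window against how far back a slave needs the ulog to assume master status.
    import os
    import time

    ULOG_DIR = "/var/ttserver/ulog"
    RETAIN_DAYS = 7
    cutoff = time.time() - RETAIN_DAYS * 86400

    segments = sorted(f for f in os.listdir(ULOG_DIR) if f.endswith(".ulog"))
    for name in segments[:-1]:  # always keep the newest segment
        path = os.path.join(ULOG_DIR, name)
        if os.path.getmtime(path) < cutoff:
            os.remove(path)
    ```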

    Read the article

  • SNMP based network discovery (switches), device (ports on switches) power management

    - by SaM
    In an enterprise network, what would be the right way to generate a list of switches (SNMP-managed)? Is it reasonable to ask the organization to supply a list such as this:
    • Switch name
    • IP address of switch
    • Location
    • SNMP community strings
    Or are there standard ways to run discovery scans, such as UDP broadcasts? After having generated a repository such as the above: given a single switch, how do I query it for the list of all devices attached to it? Finally, how do I selectively power ports down and up (remotely, using SNMP)? The platform is going to be .NET-based (C#) and the library being used is SharpSNMP. (A sketch of the underlying SNMP operations follows below.)
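    Not C#, but as an illustration of the SNMP operations themselves, here is a sketch using Python's pysnmp. The OIDs are the standard ones (IF-MIB ifAdminStatus for port up/down, BRIDGE-MIB dot1dTpFdbPort for learned MACs) and translate directly to SharpSNMP calls; the switch address and community string are placeholders.

    ```python
    # Sketch of the two SNMP operations: disabling/enabling a port and listing
    # learned MAC addresses. SWITCH and COMMUNITY are placeholders.
    from pysnmp.hlapi import (CommunityData, ContextData, Integer, ObjectIdentity,
                              ObjectType, SnmpEngine, UdpTransportTarget, nextCmd,
                              setCmd)

    SWITCH, COMMUNITY = "192.0.2.10", "private"

    def set_port_admin_status(if_index, up):
        """IF-MIB::ifAdminStatus.<ifIndex>: 1 = up, 2 = down."""
        err_ind, err_stat, _, _ = next(setCmd(
            SnmpEngine(), CommunityData(COMMUNITY),
            UdpTransportTarget((SWITCH, 161)), ContextData(),
            ObjectType(ObjectIdentity("1.3.6.1.2.1.2.2.1.7.%d" % if_index),
                       Integer(1 if up else 2))))
        return err_ind or err_stat

    def list_learned_macs():
        """Walk BRIDGE-MIB::dot1dTpFdbPort: learned MAC -> bridge port.
        (Bridge ports still need mapping to ifIndex via dot1dBasePortIfIndex.)"""
        for err_ind, err_stat, _, binds in nextCmd(
                SnmpEngine(), CommunityData(COMMUNITY),
                UdpTransportTarget((SWITCH, 161)), ContextData(),
                ObjectType(ObjectIdentity("1.3.6.1.2.1.17.4.3.1.2")),
                lexicographicMode=False):
            if err_ind or err_stat:
                break
            for oid, port in binds:
                print(oid.prettyPrint(), "->", port.prettyPrint())
    ```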

    Read the article

  • Network Management Cable Labeling Techniques and their alternatives [closed]

    - by Alex
    Possible Duplicate: What is the most effective solution you used to label cables? Yes, I know there are a lot of howtos and already-answered questions on this topic, like this one: How do you organise the cables in your racks? I am currently searching the web for different techniques (alternatives) for labeling the cables in server racks and/or data centers. Unfortunately I do not have any experience with labeling or documentation of network cables on a large scale. As far as I could find so far, the usual labeling techniques are coloring and self-defined printed labels (numbering, text), possibly following a standard. I want to know whether QR codes, RFID (OK, RFID in a data center would probably be a bad idea because of the radio frequencies, wouldn't it?), barcodes or similar have already been used by some administrators, or why they did not consider such techniques at all. Too complicated (needing a QR scanner, etc.) when you are standing in front of the cables and want quick feedback on what a cable is? What alternatives are out there? Advantages/disadvantages? Best practice? I would appreciate any help on this topic, thank you! Regards, Alex

    Read the article

  • Digital Asset Management, iPhoto / Aperture server... alternative

    - by Sisyphus
    Afternoon,
    • Clients, 10: all Apples running either Leopard or Snow Leopard
    • Server: Snow Leopard Server (and I have an old Dell PowerEdge 650 at home running Gentoo 2.6, if anybody has a Linux solution)
    The situation: I work in a small design company with 8 people; at present we are looking to consolidate all our image files in one location. Currently we each use our preferred single-user DAM solution, be it Adobe Bridge or iPhoto/Aperture (some don't bother at all). The filetypes commonly used are .psd, .pdf, .eps, .tiff, .jpg and RAW image files. Ideally what is needed:
    • Centralised on one server, but lets us search via Spotlight (not essential, but would be nice)
    • Includes searchable metadata such as date, location and title
    • Open-source, or as low-cost as possible
    • Allows simultaneous users to import files
    So far, I have looked at a few open-source DAM systems, such as Razuna, Gallery (not strictly DAM), ResourceSpace and Notre-DAM; while these are brilliant and open-source, they don't integrate as smoothly with the desktop as iPhoto and Aperture do. For iPhoto and Aperture, I have tried creating a shared library on the server (a tad laggy), and also putting a library on a drive with no permissions and letting each client read from it; however, when they want to put images into the library, it only supports one user at a time writing to it... Any ideas what could fulfil our needs? Or is it time to bite the bullet for Final Cut Server? Thanks in advance.

    Read the article

  • Change Management Software

    - by Andrew
    I manage an 80,000-user CIS application written in Uniface. Every form in the application, and many of its processes, are represented by .frm files. We have hundreds of these files and 5 instances of the application. The instances include multiple production installations which must be kept in sync. We do not get MD5 checksums from our vendor for files that are released to us as patches. We have been using a spreadsheet to track changes, but this is far from ideal. Is there a commercial application that can be purchased that will allow us to track changes across the instances? Thank you all! EDIT: Patches are released as zip files containing FRM files, SQL files, or a mix of both. The SQL files contain statements that need to be run in Oracle. Patches are also assigned unique patch numbers. (A sketch of a do-it-yourself checksum comparison is below.)
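    Until a commercial tool is in place, one stopgap for the missing vendor MD5s is to compute checksums yourself and diff them across instances. A minimal sketch, where the instance root paths are hypothetical:

    ```python
    # Compute MD5s for every .frm file under an instance and diff two instances.
    # The instance root paths below are hypothetical.
    import hashlib
    import os

    def checksum_frms(instance_root):
        """Return {relative path: md5 hex digest} for all .frm files."""
        sums = {}
        for dirpath, _, files in os.walk(instance_root):
            for name in files:
                if name.lower().endswith(".frm"):
                    path = os.path.join(dirpath, name)
                    with open(path, "rb") as f:
                        sums[os.path.relpath(path, instance_root)] = \
                            hashlib.md5(f.read()).hexdigest()
        return sums

    prod_a = checksum_frms("/apps/cis/prod_a")
    prod_b = checksum_frms("/apps/cis/prod_b")
    drift = sorted(p for p in prod_a if prod_b.get(p) != prod_a[p])
    print("Out of sync between instances:", drift)
    ```

    Run after each patch, the same digests double as the per-patch record the spreadsheet is currently tracking by hand.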

    Read the article

  • Unix Password Management Keyring

    - by Phil
    I am looking for a password manager for a command-line Unix environment. So far all I can find are keyring applications for Windows, Linux and Mac, but no command-line Unix interfaces. My main goal is to be able to access a password keyring through an SSH connection to a machine that has no graphical user interface. If there are no good Unix password keyrings out there, what would be a better way to store personal passwords in a central location? (One possible low-tech approach is sketched below.)
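    One low-tech pattern that works over plain SSH, in case no ready-made keyring turns up: keep a GPG-encrypted text file on the central host and wrap it in a small helper. A sketch, assuming the gpg CLI is installed; the store path is hypothetical:

    ```python
    # Grep a GPG-encrypted password file without writing plaintext to disk.
    # Assumes the gpg CLI is available; STORE is a hypothetical path.
    import subprocess

    STORE = "/home/phil/.passwords.gpg"

    def grep_store(needle):
        """Decrypt the store (gpg prompts for the passphrase) and print
        only the lines mentioning one site or account."""
        out = subprocess.run(["gpg", "--quiet", "--decrypt", STORE],
                             capture_output=True, text=True, check=True)
        for line in out.stdout.splitlines():
            if needle.lower() in line.lower():
                print(line)

    grep_store("github")
    ```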

    Read the article

  • Server Room Protocols/Server Room Management

    - by Matthew E
    Hi, I'm new to this site but have found the articles and feedback very useful. We have a server room which our organisation owns and controls, yet there are several third-party companies that have open access to this room. As such, we have been asked to put together a protocol paper that stipulates the standards we expect to be adhered to when working in this room. Other than the monitoring of UPS loads, air-cooling functionality, alarm systems, etc., does anyone have any guidance on the kinds of issues that need to be documented to make this protocol all-encompassing? I'm thinking along the lines of not leaving cardboard or other combustibles in the room, not having food and drink in the room, not altering the fabric of the building by drilling through walls, etc. Many thanks in advance for any guidance provided.

    Read the article

  • IT Asset Management

    - by CogitoErgoSum
    Our company has grown quite quickly, and I am facing new tasks which I did not think I'd need to deal with. Recently we've come to a point where we have 100+ devices (routers, bridges, computers, laptops, VoIP phones, etc.). The other day I was quite frightened when I asked for an inventory and no one had one. I want to start tagging all equipment and recording serials, to begin tracking our inventory and ensuring we have a proper record of what equipment we have. Does anyone have advice on how to go about (1) convincing the higher-ups why we need to do this, and (2) what software or strategies might work? Keep in mind this is not for furniture, office equipment, etc., but IT-specific equipment. I'm concerned about people (1) stealing the physical devices and (2) losing track of configuration data, etc., in case we'd need to do a wipe and restore.

    Read the article

  • download management

    - by Jonathan
    I download many files, usually 2 or 3 a day, often 10ish. Some of them are duplicates because I just can't be bothered to find the original in my downloads folder. I have previously tried DAP and used it to create a new subfolder for each day's downloads, yet I have found this insufficient, as sometimes I wish to find files by name or file type, or I have multiple parts of downloads spanning more than one day. Another problem I have found is with zips/rars/etc.: after downloading and extracting them, I am left with both the archive and the folder. I like how on a Mac the zip is automatically extracted after it has been downloaded and the zip removed. What I'd like to be able to do is sort the downloads by date, but dynamically, so they all just sit in the big downloads folder, but I can press a button and it will show me all the files from a particular site, from a particular day, or of a certain file type (a DIY sketch of this idea follows below). Is there any software that will do this? I use Chrome as a browser but also have Firefox and like that. Jonathan
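    Failing a ready-made tool, the "dynamic views" idea is scriptable. A minimal sketch of date/type queries plus Mac-style zip extraction, with the folder path as the only assumption:

    ```python
    # Query one flat Downloads folder by day or extension; auto-extract zips.
    import datetime
    import os
    import zipfile

    DOWNLOADS = os.path.expanduser("~/Downloads")

    def by_day(day):
        """All files last modified on a given datetime.date."""
        for name in os.listdir(DOWNLOADS):
            path = os.path.join(DOWNLOADS, name)
            if datetime.date.fromtimestamp(os.path.getmtime(path)) == day:
                yield name

    def by_type(ext):
        """All files with a given extension, e.g. '.pdf'."""
        return [n for n in os.listdir(DOWNLOADS) if n.lower().endswith(ext)]

    def extract_and_remove_zips():
        """Unpack each zip into a same-named folder, then delete the archive."""
        for name in by_type(".zip"):
            path = os.path.join(DOWNLOADS, name)
            with zipfile.ZipFile(path) as z:
                z.extractall(os.path.join(DOWNLOADS, name[:-4]))
            os.remove(path)

    print(list(by_day(datetime.date.today())))
    ```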

    Read the article

  • Best photo management software?

    - by Niels Basjes
    Hi,
    What I would like is a single piece of software (or a smart combination of tools) that allows me to manage my photos better than what I've found so far.
    1. Tags: Primarily I need a way of tagging the images, so I can manually tag photos the same way we tag questions here at SO/SF/SU. I want this software to apply a lot of the tags automagically (obvious things like date and resolution; see the sketch below).
    2. Face recognition: What I would really like is a feature that recognizes faces in images and tags them with the person's name. So far I've only heard of one online photo system that can do that (Picasa), and not yet of any offline tool.
    3. Version database: I must have some kind of central Git/SVN/... that contains all images. I had a hard-drive corruption a few years ago and it took me a long time to figure out which images had been damaged. I always want to be able to go back to what the camera produced.
    4. Website: I want to be able to generate a website (a few tag-specific websites) based on the actual content.
    5. Easy bulk uploading: Many photo tools have a one-by-one uploading option. I prefer simply 'throwing' my images onto a file server under Linux (Samba) and letting the system automagically integrate, tag, recognize, etc. all images.
    OK, I know this is a bit much. Perhaps you guys have some suggestions about existing tools that could make this possible, or even a complete system that does this. EDIT: To clarify on the OS: I prefer Linux for any 'server' task and Windows XP for any 'desktop' task. Thanks for all your input. Niels Basjes
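    On requirement 1, the "automagic" tags for date, resolution and camera are straightforward to pull from EXIF. A minimal sketch using Pillow; the file path is hypothetical, and points 2-5 need other tools entirely:

    ```python
    # Derive automatic tags (resolution, capture date, camera) from a photo's EXIF.
    # Uses Pillow; _getexif() is JPEG-oriented. The path below is hypothetical.
    from PIL import Image
    from PIL.ExifTags import TAGS

    def auto_tags(path):
        img = Image.open(path)
        tags = {"width": img.width, "height": img.height}
        exif = img._getexif() or {}
        for tag_id, value in exif.items():
            name = TAGS.get(tag_id)
            if name == "DateTimeOriginal":
                tags["taken"] = value      # e.g. '2010:05:14 16:02:11'
            elif name == "Model":
                tags["camera"] = value
        return tags

    print(auto_tags("/photos/incoming/example.jpg"))
    ```

    A cron job on the Samba share could run this over new arrivals and feed the tag store, whatever form that takes.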

    Read the article

  • Unless I start Management Studio with "Run as administrator", I get a Login Failure (Error 18456)

    - by MedicineMan
    I cannot connect to my SQL Server instance unless I start Management Studio as an administrator. I am running Windows 7, SQL Server 2008, and Management Studio 10.0. If I run as a normal user, the error I get is: "Cannot connect to ... Additional information: Login failed for user 'COMPUTERNAME\MyUserName'. (Microsoft SQL Server, Error 18456)". For the server name I have tried the following: ".", "localhost", and "COMPUTERNAME".

    Read the article

  • Application for time and project management

    - by user10826
    I want to improve the way I organize my projects/tasks/schedule. What I do now is:
    • keep an Excel sheet with the names of the most important tasks/projects; I look at it at the beginning of each day and decide which ones I will focus on
    • in iCal I write down events for each day, or for a concrete time slot (13:00 to 14:00); each day I set up the tasks I want to accomplish and allocate them hours
    • I use Things (Cultured Code) to keep info about tasks and projects that are not very important and not yet time-allocated (GTD name: someday)
    • I use Mail on Mac and create folders for the mails I want to process, named after the different projects
    • I save the main info for each project in FreeMind maps
    My system works well at the moment, but it is pretty complicated to use. I want to make it better, and I am looking for something with these requirements:
    • must be 100% offline accessible
    • should use as few programs/resources as possible; ideally just one program should be able to manage all my info
    • I can use the GTD methodology mixed with priorities, and I can allocate each task, converted to an event, on my calendar
    • I can have different daily/weekly, etc. views on a calendar to see the "big picture"
    • must run on Mac OS X Leopard
    • price does not matter; I will pay for this
    So, according to your experience, can you recommend me something like this? Thanks

    Read the article

  • WEB based HPC cluster node management

    - by Skuja
    Hello, I am working on my school diploma thesis. The main goal is to create a web-based application where logged-in users can see free and busy nodes, turn them on and off, see what processes they are running, etc. I figured I could write a cron daemon that would run every 30 seconds or so and could run the ping utility against each node to find out whether it is on or off, then write the results to a file (a sketch of this follows below). From my web app (which I will write in PHP) I could then read that info. Would that be a good solution? How would you suggest I do it? And finally, are there any existing solutions (not necessarily web-based) for the management of cluster nodes?
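    The collector described above is only a few lines. A minimal sketch that pings each node once and writes a JSON status file for the PHP front end to read; node names and the output path are hypothetical:

    ```python
    #!/usr/bin/env python
    # Ping every cluster node once and write a JSON status file for the web app.
    # NODES and STATUS_FILE are hypothetical; run from cron every 30 seconds or so.
    import json
    import subprocess
    import time

    NODES = ["node01", "node02", "node03"]
    STATUS_FILE = "/var/www/data/node_status.json"

    def is_up(host):
        """One ICMP echo, 2-second timeout (Linux ping flags)."""
        rc = subprocess.call(["ping", "-c", "1", "-W", "2", host],
                             stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
        return rc == 0

    status = {"checked": time.time(),
              "nodes": {n: ("up" if is_up(n) else "down") for n in NODES}}
    with open(STATUS_FILE, "w") as f:
        json.dump(status, f)
    ```

    The PHP page then just decodes the JSON; the daemon and the front end never have to talk to each other directly.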

    Read the article

  • Hyper-V management remotely

    - by Péter
    I'll tell you in advance that I'm a newbie on this topic. I have a Win8 (Home) machine with Hyper-V installed, sitting behind a router. The router has a public IP and a domain attached. I have another Win8 (Work) machine, also with Hyper-V installed. I want to access my home Hyper-V from work via Hyper-V Manager so I can manage my virtual machines. I found this article but I don't know if it's applicable to me. I thought that simple port forwarding should work: I would only need to give the work Hyper-V Manager my domain and the port I chose, and if a login form pops up, fill in the user data of my home computer. How can I solve this? My thoughts revolve around:
    • Port forwarding: set domain + port and use my home user
    • Setting up a VPN and using the local IP address of my home computer (it looks a little cumbersome, and my router only supports PPTP)
    I'm open to any other solution too. Thanks, Péter

    Read the article

  • Cable management between multiple racks

    - by RippieUK
    I am in the planning phase of re-cabling our five racks in one of our offices, and I would like to ask for some guidance on how you go, or would go, about managing cables that run between racks. In our situation we have five racks, where the one furthest to the right is our main patch panel for 300 floor ports. The rack next to it is our main comms rack, where the main switches and ISP routers are located. The other three racks next to the comms rack all need to connect back to the main comms rack. I am not sure whether a 48-port patch panel in each rack would be any good for this scenario, mainly because I am not sure it could be linked back to the main switch with only one cable. Would a 48-port switch in each rack be better, since you can uplink those back to the main switch? Or should we just run cables between racks back to the main switch? Hope someone can offer some guidance. Thx

    Read the article

  • How to display configurable product in each color in Magento product listing?

    - by Thomas
    I have a configurable product which is available in many different colors and sizes, and I want the configurable product to appear once for every color. My idea is to assign one simple product of the configurable product in each color to the category of the configurable product, and then change the listing so that each (colored) simple product links to its master product (the configurable one). The other way would be to just assign the configurable product to the category and list it multiple times with different colors, but I think this would be too complicated.

    Read the article
