Search Results

Search found 22625 results on 905 pages for 'must do better'.

Page 428/905 | < Previous Page | 424 425 426 427 428 429 430 431 432 433 434 435  | Next Page >

  • The Talent Behind Customer Experience

    - by Christina McKeon
    Earlier, I wrote about Powerful Data Lessons from the Presidential Election. A key component of the Obama team’s data analysis deserves its own discussion—the people. Recruiters are probably scrambling to find out who those Obama data crunchers are and lure them into corporations. For the Obama team, these data scientists became a secret ingredient that the competition didn’t have. This team of analysts knew how to hear the signal and ignore the noise, how to segment and target its base, and how to model scenarios and revise plans based on what the data told them. The talent was the difference. As you work to transform your organization to be more customer-centric, don’t forget that talent is a critical element. Journey mapping is a good start to understanding how your talent impacts your customer experiences. Part of journey mapping includes documenting the “on-stage” and “back-stage” systems and touchpoints. When mapping this part of your customers’ journey, include the roles and talent behind the employee actions—both customer facing and further upstream from that customer touchpoint. Know what each of these roles does, how well you are retaining people in these areas, and your plans to fill these open positions in the future. To use data scientists as an example, this job will be in high demand over the next 10 years. The workforce is shrinking, and higher education institutions may not be able to turn out trained data scientists as fast as you need them. You don’t want to be caught with a skills deficit, so consider how you can best plan for the future talent you will need. Have your existing employees make their career aspirations known to you now. You may find you already have employees willing to take on roles that drive better customer experiences. Then develop customer experience talent from within your organization through targeted learning programs. If you know that you will need to go outside the organization, build those candidate relationships now. Nurture the candidates you want to hire and partner with universities, colleges, and trade associations so you can increase the number of qualified candidates in your talent pool.

    Read the article

  • System Restore Points

    - by Ross Lordon
    Currently I am investigating how to schedule the automatic creation of a system restore point on all of the workstations in my office. It seems that Task Scheduler already has some nice defaults (screenshots below). Even the history for this task verifies that it is running successfully. However, when I go to Recovery in the Control Panel, it only lists one System Restore Point for each previous week, and only going 3 weeks back, even if I check the "show more restore points" box. Why don't the additional ones appear? Would there be a better way to implement a solution, like via Group Policy or a script? Is there any documentation on this online? I've been having a hard time tracking anything down, except for how to work around a Group Policy that disables this feature.

    Read the article

  • Storing data offline with javascript

    - by Walker
    My question is about storing data offline, and potentially whether I will need to bring in an outside programmer or whether this could be learned within a few weeks. The website I am working on will have an interface where users log in and go through a series of quizzes in the form of checkboxes, drop-down menus, and others. Each page/quiz area could have 20-100 total checkboxes in a series of 3-5 rows because of the comprehensive nature of the course. This I can do - I know how to code the quiz and return a correct or incorrect answer based on each individual checkbox and present a cumulative score (i.e., you got 57% correct). The issue lies in the fact that I would like to save the users' results and keep them informed of their progress. When they complete all of the quizzes, I would like to have a visual output of their performance in each area. Storing the output from their results offline is where I think I may run into a problem with my lack of coding experience. I would also like to have a sidebar with their progress in each section (10-15), with a green percentage completion bar or a % correct, which would draw from this data. I have never had to code something that stores information like this offline - so back to my question: would it be better to learn the language needed, or to bring in a coder/developer for the back-end stuff?

    Read the article

  • Collision with half of a semi-circle

    - by heitortsergent
    I am trying to port a game I made using Flash/AS3 to the Windows Phone using C#/XNA 4.0. You can see it here: http://goo.gl/gzFiE In the Flash version I used pixel-perfect collision between meteors (each is a rectangle, and usually rotated) that spawn outside the screen and move towards the center, and a shield in the center of the screen (which is half of a semi-circle, also rotated by the player), which made the meteor bounce back in the opposite direction it came from when they collided. My goal now is to make the meteors bounce at different angles, depending on the position at which they collide with the shield (much like Pong, where hitting the borders causes a change in the ball's angle). So, these are the 3 options I thought of: Pixel-perfect collision (Microsoft has a sample), but then I wouldn't know how to change the meteor's angle after the collision. Three BoundingCircles to represent the half semi-circle shield, but then I would have to somehow move them as I rotate the shield. Farseer Physics: I could make a shape composed of 3 lines, and use that as the collision object for the shield. Is there any other way besides those? Which would be the best way to do it (it's aimed at mobile devices, so pixel-perfect is probably not a good choice)? Most of the time there's always an easier/better way than what we think of...
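
    Whichever collision method is used, the angle change itself comes down to one formula (a sketch of the standard reflection math, not something from the original post): reflect the meteor's velocity about the unit normal at the contact point,

        % v: incoming velocity, n: unit contact normal, v': outgoing velocity
        \mathbf{v}' = \mathbf{v} - 2(\mathbf{v} \cdot \mathbf{n})\,\mathbf{n}

    For a circular shield, n is simply the normalized vector from the circle's center to the contact point, which is exactly what produces the Pong-style dependence of the bounce angle on where the meteor hits.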

    Read the article

  • Are injectable classes allowed to have constructor parameters in DI?

    - by Songo
    Given the following code:

        class ClientClass {
            public function print() {
                // some code to calculate $inputString
                $parser = new Parser($inputString);
                $result = $parser->parse();
            }
        }

        class Parser {
            private $inputString;

            public function __construct($inputString) {
                $this->inputString = $inputString;
            }

            public function parse() {
                // some code
            }
        }

    Now ClientClass has a dependency on the Parser class. However, if I wanted to use Dependency Injection for unit testing, it would cause a problem, because now I can't pass the input string to the Parser constructor as before, since it's calculated inside ClientClass itself:

        class ClientClass {
            private $parser;

            public function __construct(Parser $parser) {
                $this->parser = $parser;
            }

            public function print() {
                // some code to calculate $inputString
                $result = $this->parser->parse(); // --> will throw an exception since no string was provided
            }
        }

    The only solution I found was to modify all my classes that took parameters in their constructors to use setters instead (for example, setInputString()). However, I think there might be a better solution, because sometimes modifying existing classes can cause more harm than benefit. So: are injectable classes not allowed to have constructor parameters? And if a class must take parameters in its constructor, what would be the way to inject it properly? UPDATE: Just for clarification, the problem happens when, in my production code, I decide to do this:

        $clientClass = new ClientClass(new Parser($inputString)); // ---> I have no way to predict $inputString, as it is calculated inside ClientClass itself.

    UPDATE 2: Again for clarification, I'm trying to find a general solution to the problem, not one for this example code only, because some of my classes have 2, 3 or 4 parameters in their constructors, not only one.
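
    One common way out (a hedged sketch, not from the original question; ParserFactory is an invented helper) is to inject a factory instead of a finished Parser, so runtime data stays out of the injected constructor:

        class ParserFactory {
            // Defers Parser construction until the runtime string is known
            public function create($inputString) {
                return new Parser($inputString);
            }
        }

        class ClientClass {
            private $parserFactory;

            public function __construct(ParserFactory $parserFactory) {
                $this->parserFactory = $parserFactory;
            }

            public function print() { // note: 'print' as a method name requires PHP 7+
                // some code to calculate $inputString
                $inputString = '...';
                $parser = $this->parserFactory->create($inputString);
                return $parser->parse();
            }
        }

    A unit test can then inject a factory whose create() returns a stub Parser, and the same idea generalizes to constructors with 2, 3 or 4 runtime parameters.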

    Read the article

  • Using an AWS EC2 server to host a busy website and I need to set up load balancing

    - by Philip Isaacs
    My company has one EC2 server running on AWS, with a MySQL DB and Apache on the same instance. This one instance hosts a website built on the PHP Zend Framework. The site runs like crap when it starts to get busy with a lot of traffic, so I'm looking for some advice on how to set up something that can handle the load better. My first question is: should I move the MySQL DB onto a separate EC2 instance, or perhaps use AWS's RDS service, which looks like a nice option? I'm sort of new to some of this, but I'm guessing I'll need at least two EC2 instances to serve the website from and some sort of load-balancing mechanism to distribute traffic. But maybe not, I'm not sure. Also, what are some best practices for replicating the data so that it stays in sync across instances? Okay, I know these are a lot of questions, but I don't know where to start, so any advice will help.
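
    For what it's worth, if the database does move to RDS, the application-side change in a Zend Framework 1 app can be as small as pointing the adapter at the RDS endpoint (a sketch; the endpoint and credentials below are placeholders):

        // Swap the DB host from the local instance to the RDS endpoint
        $db = Zend_Db::factory('Pdo_Mysql', array(
            'host'     => 'mydb.xxxxxxxx.us-east-1.rds.amazonaws.com', // hypothetical RDS endpoint
            'username' => 'appuser',
            'password' => 'secret',
            'dbname'   => 'production',
        ));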

    Read the article

  • How the OS Makes the Database Scream

    - by rickramsey
    Few things are as satisfying as a screaming burnout. When Oracle Database engineers team up with Solaris engineers, they do a lot of them. Here are a few of the reasons why. Article: How the OS Makes the Database Fast - Oracle Solaris For applications that rely on Oracle Database, a high-performance operating system translates into faster transactions, better scalability to support more users, and the ability to support larger capacity databases. When deployed in virtualized environments, multiple Oracle Database servers can be consolidated on the same physical server. Ginny Henningsen describes what Oracle Solaris does to make the Oracle database run faster. Interview: Why Is The OS Still Relevant? In a world of increasing virtualization and growing interest in cloud services, why is the OS still relevant? Michael Palmeter, senior director of Oracle Solaris, explains why it's not only relevant, but essential for data centers that care about performance. Interview: An Engineer's Perspective: Why the OS Is Still Relevant Sysadmins are handling hundreds or perhaps thousands of VMs. What is it about Solaris that makes it such a good platform for managing those VMs? Liane Praza, senior engineer in the Solaris core engineering group, provides an engineer's perspective. Interview in the Lab: How to Get the Performance Promised by Oracle's T5 SPARC Chips If you want your applications to run on the new SPARC T5/M5 chips, how do you make sure they use all that new performance? Don Kretsch, Senior Director of Engineering, explains. Interview: Why Oracle Database Engineering Uses Oracle Solaris Studio The design priorities for Oracle Solaris Studio are performance, observability, and productivity. Why this is good for ISVs and developers, and why it's so important to the Oracle database engineering team. Taped in Oct 2012. - Rick Follow me on: Blog | Facebook | Twitter | YouTube | The Great Peruvian Novel

    Read the article

  • How do I properly dual boot Ubuntu and Windows 7 on a UEFI system?

    - by S. Robert James
    I have an HP Z210 with Windows 7, which is UEFI based. I'd like to dual boot it to Ubuntu 11.10 with GRUB2, but am having problems due to the UEFI. The install CD goes to the end, but then the machine always boots right into Windows. These problems are apparently documented (here and here), but there's no consensus as to what the simplest solution is. Any recommendations? I want Ubuntu and Windows to both be bootable. (Perhaps if I knew more about UEFI and its system partition, and how it differs from BIOS and MBR, I'd be in a better position. So background answers explaining how UEFI loads up are very appreciated.)

    Read the article

  • Geekswithblogs.net | Screen Resolutions of our Readers

    - by Jeff Julian
    Yesterday I talked about the browsers we see being used by our readers, based on our Google Analytics traffic, and today I want to share the screen resolutions we see. Having been a web developer most of my life, I know it is hard to decide how large you should build your application, because you typically have a couple of huge high-resolution monitors on your desk, but your typical end user is thought to have 1024x768. With HTML5/CSS3 out, it is a little easier to come up with a design that will scale to all resolutions, but it is still nice to know the numbers when it comes to how much real estate I have on my clients' screens. If you look at these numbers for Geekswithblogs.net, we have a lot of high-resolution monitors among the users that visit the site. After a little more investigation of the numbers, you will notice we do not have as much height available as we do width. If the primary goal of a site is to deliver as much data as possible in the viewable area without scrolling, this becomes a challenge when most of our pages have long pieces of formatted data. So our challenge is to build skins that make more use of the sides of the content toward the top on larger-resolution browsers and then entice the reader to scroll to get to the goodies embedded in the content of the posts. It is going to be an interesting battle for sure, but we really need more skin offerings on the site. Technorati Tags: Resolution Statistics, Geekswithblogs.net

    Read the article

  • How do I catalog files on several external hard drives that I want to store off-line? OSX

    - by raudi
    My partner, an artist, has more than 10 external hard disks, both USB and FireWire, and every 2-3 months a new one has to be added (she's working with videos and pictures). Currently it's 10 TB and growing, so too much for an affordable NAS. Right now the files are not indexed, and I think they cannot be searched with Spotlight because not all drives can be connected at the same time. So if she wants to search for a file, she has to guess which disk or disks (based mostly on the date) and then search several drives. Now I'm looking for a solution to index/catalog the drives, something like GentibusCD, Cathy, or Disclib (all of these solutions are unfortunately Windows-only). Is there any software for OSX that will catalog all the hard drives, so she can search the catalog, find the files, and get the ID of the disk / disk name that has the content? Preferably something with a GUI so my partner can also use it easily. Preferably with thumbnails for pictures/videos. (But even an equivalent of "tree /F /A" would be better than nothing.)

    Read the article

  • Entity system in Lua, communication with C++ and level editor. Need advice.

    - by Notbad
    Hi! I know this is a really difficult subject. I have been reading a lot these days about entity systems, etc., and now I'm ready to ask some questions (if you don't mind answering them) because I'm really mixed up. First of all, I have a basic 2D editor written in Qt, and I'm in the process of adding entity editing. I want the editor to be able to receive RTTI information from entities to change properties, create some logic by linking published events to published actions (e.g., a level's "activate" event throws a door's "open" action), etc. Because of all this, I guess my entity system should be written in a scripting language, in my case Lua. On the other hand, I want to use a component-based design for my entities, and here my questions start: 1) Should I define my components in C++? If I do this in C++, won't I lose all the RTTI information I want for my editor? On the other hand, I use Box2D for physics; if I define all my components in script, won't it be a lot of work to expose third-party libs to Lua? 2) Where should I place the message system for my game engine? Lua? C++? I'm tempted to just have C++ objects behave as servers, offering services to the Lua business logic: things like the physics system, rendering system, input system, World class, etc. And for all the other things, Lua: creation/composition of entities based on components, game logic, etc. Could anyone give any insight on how to accomplish this, and which approach is better? Thanks in advance, HexDump.

    Read the article

  • What version of Windows Server 2008 to Get

    - by dragonmantank
    One of the projects I'm working on looks like it will need to migrate from CentOS 5.4 to something else (we need to run PostgreSQL 8.3+, and CentOS/RHEL only support 8.1), and one of the options is Windows Server. Since 2008 R2 is out, that's what I'm looking at. I'll need to run Postgres and Tomcat, and don't really require anything extra that Windows has, like IIS (if I can run Server Core, even better!). The other kicker is that it will be virtualized through VMware ESXi 4.0, so that we have three separate boxes: development, quality, and production servers. From a licensing standpoint, though, am I good enough with just the Web Server edition? Am I right in assuming that will be three licenses? Or should I just jump up to Enterprise so that I get 4 VM licenses?

    Read the article

  • Fusion HCM in Boots

    - by Kristin Rose
    These boots are made for walking, and that’s just what they’ll do…Of course by boots, we’re referring to Oracle’s HCM Boot Camps for OPN members, which offer a hands-on approach to learning about Oracle Fusion HCM and Taleo positioning and capabilities. Those who attend an Oracle HCM boot camp will be prepared to achieve Oracle Fusion HCM Presales Specialist status, discuss Oracle Fusion HCM with customers to build pipeline, and complete competency criteria toward Oracle Fusion HCM 11g Specialization! This in-person event offers expert-led sessions, discussion, and hands-on activities meaning you will get the information quicker and remember it better! Plus, we think a free lunch is always a good thing. As a next step, all interested partners should: Obtain self-service knowledge from the Oracle Fusion Human Capital Management 11g PreSales Specialist Guided Learning Path. Become a Specialist by completing the Oracle Fusion Human Capital Management 11g PreSales Specialist Assessment . Contact their regional Oracle Alliances & Channels point-of-contact to learn more about these free OPN Boot Camp events, and the opportunity to attend the next one. We know you’ll be strutting your stuff after you've gained the knowledge and expertise to become Oracle Fusion HCM Specialized! Check it out! The OPN Communications Team 

    Read the article

  • Configuration tools for multiple monitors for X / Linux

    - by richard
    I have Ubuntu 10.04 running GNOME and two monitors. I am wondering if I can get a better multi-monitor configuration tool. The one I have, gnome-display-properties, has too many problems. For example, when I swapped my monitors over, with the narrower one now on the left, there was a width calculation error, such that I got a virtual monitor the width of the wide monitor placed across the narrow monitor and part of the wide monitor, and a virtual narrow monitor on the remainder of the wide monitor. I would like: no bugs; to be able to select which is the primary monitor; to have multiple configurations; configurations to be selected automatically based on which monitors are attached; configurations to be cycled (reliably) when the display-mode key is pressed; windows to migrate to the remaining monitors when a display is deactivated; and an option to not change display resolution when mirroring, but to use side/top blanking bars to pad out the screen.

    Read the article

  • Disaster Recovery Example

    Previously, I used to work for a small internet company that sells dental plans online. Our primary focus concerning disaster prevention and recovery was on our corporate website and private intranet site. We had a multiphase disaster recovery plan that included data redundancy, load balancing, and off-site monitoring. Data redundancy was a key aspect of our disaster recovery plan. The first phase of this was to replicate our data to multiple database servers and schedule daily backups of the databases, which were stored off site. The next phase was the file replication of data amongst our web servers, which were also backed up daily by our colocation facility. In addition to the files located on the server, files were also stored locally on development machines, and again backed up using version control software. Load balancing was another key aspect of our disaster recovery plan. Load balancing offered many benefits for our system: better performance, load distribution, and increased availability. With our servers behind a load balancer, our system had the ability to accept multiple requests simultaneously because the load was split between multiple servers. Plus, if one server was slow or experiencing a failure, traffic was diverted amongst the other servers connected to the load balancer, allowing the failing server time to get back online. The final key to our disaster recovery plan was off-site monitoring that notified all IT staff of any outages or errors on the main website encountered by the monitor. Messages were sent by email, voicemail, and SMS. According to Disasterrecovery.org, disaster recovery planning is how companies successfully manage crises with minimal cost and effort and maximum speed, compared to others that are forced to make decisions out of desperation when disasters occur. In addition, SunGard stated in 2009 that the first step in disaster recovery planning is to analyze company risks and factor in fixed costs for things like hardware, software, staffing, and utilities, as well as indirect costs, such as floor space, power protection, physical and information security, and management. Availability requirements also need to be determined per application and system, as well as the strategies for recovery.

    Read the article

  • Sharing password-protected videos on social media

    - by PaulJ
    We are developing a site where users will be able to watch and download videos that they've recorded of themselves at a public event. The videos will be password protected, and will be available only to users who have paid for them at the event... ...But on the other hand, we also want users to share those videos on social media, since they will be attractive publicity for our events. Having people log into our site with their password, download the video, and then re-upload it to YouTube/Facebook would be too cumbersome, and I suspect that few users would be willing to do that. So the obvious alternative is to have one of those convenient "share" buttons, but the problems with that approach are that: The video will be physically hosted (and linked to) on our site. What happens if those videos go viral and our bandwidth cost explodes? The video is password protected. The solution I've thought of for this is: upload the user's video to our password-protected site and to YouTube at the same time, as an unlisted video. The user can access our site with his password and download his video (to watch on his TV or whatever). If the user hits the "share" button, we show him the YouTube link... and we turn the video into a listed one. This seems in line with the ideas in Using YouTube as a CDN, and there didn't seem to be any objections in that question. I'm posting this just to confirm that my idea doesn't violate any YouTube TOS, and also to see if it is a good one or whether there might be better alternatives.
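
    If the unlisted-to-listed flip is automated at share time, it is one small call against the YouTube Data API (a sketch assuming the google-api-php-client with an OAuth-authorized $client and a stored $videoId; class and method names follow Google's v3 PHP samples and should be verified):

        // Fetch the video's status part and flip its privacy from unlisted to public
        $youtube = new Google_Service_YouTube($client);
        $listResponse = $youtube->videos->listVideos('status', array('id' => $videoId));
        $video = $listResponse[0];
        $video->getStatus()->setPrivacyStatus('public');
        $youtube->videos->update('status', $video);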

    Read the article

  • Open Source Visualization and Dashboard Software

    - by helios
    I am working on open source Application Performance Monitoring (APM) software and am looking for a visualization tool with dashboard capabilities. I came across Graphite, which looks pretty good, but I'm wondering if there is anything better out there before I settle on that tool. Here's the list of features I am interested in. Must-have: an open source license; an API to submit real time data; a web-based visualization interface; persistence (file or database). Nice-to-have: dashboard capabilities, i.e., allow users to select a few metrics (CPU, heap usage, # of active users, etc.) and place them on a single page for easier monitoring. Any suggestions?
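
    On the "API to submit real time data" point: Graphite's Carbon listener speaks a plaintext protocol of "metric-path value timestamp" lines over TCP (port 2003 by default), so submitting a data point is trivial from almost any language (a sketch; the hostname and metric name are made up):

        // Push one APM data point into Graphite's plaintext Carbon port
        $conn = fsockopen('graphite.example.com', 2003, $errno, $errstr, 5);
        if ($conn) {
            fwrite($conn, sprintf("apm.app1.cpu_percent %.2f %d\n", 42.5, time()));
            fclose($conn);
        }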

    Read the article

  • Why does the command forfiles list files that are not a day old despite the command specifying otherwise?

    - by PeanutsMonkey
    The command I am executing is forfiles -p"C:\testdata" -m*.* -d-1 -c"cmd /c echo @PATH\@FILE" I have specified that I wish to list only files that are at least a day old; however, when I execute the statement, it returns a list of files that were created today. Why is that? Am I doing something wrong? Would it be better to specify a time period as opposed to a date, e.g., 24 hours? The version of forfiles I have reads as follows: FORFILES v 1.1 - [email protected] - 4/98. The batch file is being run on Windows XP.

    Read the article

  • Good design for class with similar constructors

    - by RustyTheBoyRobot
    I was reading this question and thought that good points were made, but most of the solutions involved renaming one of the methods. I am refactoring some poorly written code and I've run into this situation:

        public class Entity {
            public Entity(String uniqueIdentifier, boolean isSerialNumber) {
                if (isSerialNumber) {
                    this.serialNumber = uniqueIdentifier;
                    // Lookup other data
                } else {
                    this.primaryKey = uniqueIdentifier;
                    // Lookup other data with different query
                }
            }
        }

    The obvious design flaw is that someone needed two different ways to create the object, but couldn't overload the constructor since both identifiers were of the same type (String). Thus they added a flag to differentiate. So, my question is this: when this situation arises, what are good designs for differentiating between these two ways of instantiating an object? My first thoughts: You could create two different static methods, with different names, to create your object. This is weak because static methods don't get inherited. You could create different types to force a distinction (i.e., make a PrimaryKey class and a SerialNumber class). I like this because it seems to be a better design, but it also is a pain to refactor if serialNumber is a String everywhere else.
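
    For the record, the static-method idea usually takes the shape of named constructors wrapping a private real constructor (a sketch in PHP to match the other examples on this page; the pattern is identical in Java, modulo the inheritance caveat above):

        class Entity {
            private $serialNumber;
            private $primaryKey;

            // Private: callers must come through a named factory method
            private function __construct() {}

            public static function fromSerialNumber($serialNumber) {
                $e = new Entity();
                $e->serialNumber = $serialNumber;
                // Lookup other data
                return $e;
            }

            public static function fromPrimaryKey($primaryKey) {
                $e = new Entity();
                $e->primaryKey = $primaryKey;
                // Lookup other data with different query
                return $e;
            }
        }

    Call sites then read Entity::fromSerialNumber($sn) versus Entity::fromPrimaryKey($pk), and the boolean flag disappears.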

    Read the article

  • One codebase - lots of hosted services (similar to a basecamp style service) - planning structure

    - by RickM
    We have built a service (PHP-based) for a client, and are now looking to offer it to other clients as a hosted service. For this example, think of it like a hosted forum service, where a client signs up on our site and is given a subdomain (or can use their own domain), and the code picks up the domain, checks it against a 'master' users table, and then loads the content as needed. I'm trying to work out the best way of handling multiple clients. At the moment I can only think of two options that would work: Option 1 - Have 1 set of database tables, but on each table have a column called 'siteid' - this would mean every query has to check the siteid. This would effectively work with just 1 codebase and 1 database. Option 2 - Have 1 'master' database with all the core stuff, such as the client details and their domain. Then, when the system checks the domain, it pulls the client's database details (username/password/dbname) from a table and loads a second database. The issue here is the security of the MySQL server details; however, it does have the benefit that each client is running their own database instead of sharing one. Which option would it be better to take here, and why? Ideally I want it to be fairly easy to convert the 'standalone' script to the 'multi-domain' script, as we're on a tight deadline.
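
    As an illustration of Option 2's lookup step (a hedged sketch; the table and column names are invented):

        // Resolve the tenant from the requested domain, then open its own database
        $master = new PDO('mysql:host=localhost;dbname=master', 'master_user', 'secret');
        $stmt = $master->prepare(
            'SELECT db_host, db_name, db_user, db_pass FROM clients WHERE domain = ?'
        );
        $stmt->execute(array($_SERVER['HTTP_HOST']));
        if ($client = $stmt->fetch(PDO::FETCH_ASSOC)) {
            $tenantDb = new PDO(
                "mysql:host={$client['db_host']};dbname={$client['db_name']}",
                $client['db_user'],
                $client['db_pass']
            );
        }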

    Read the article

  • New SPC2 benchmark- The 7420 KILLS it !!!

    - by user12620172
    This is pretty sweet. The new SPC2 benchmark came out last week, and the 7420 not only came in 2nd of ALL speed scores, but came in #1 for price per MBPS. Check out this table. The 7420 score of 10,704 makes it really fast, but that's not the best part. The price one would have to pay in order to beat it is ridiculous. You can go see for yourself at http://www.storageperformance.org/results/benchmark_results_spc2. The only system on the whole page that beats it was over twice the price per MBPS. Very sweet for Oracle. So let's see: the 7420 is the fastest per $. The 7420 is the cheapest per MBPS. The 7420 has incredible built-in features, management services, analytics, and protocols. It's extremely stable and, as a cluster, has no single point of failure. It won the Storage Magazine award for best NAS system this year. So how long will it be before it's the number 1 NAS system in the market? What are the biggest hurdles still stopping the widespread adoption of the ZFSSA? From what I see, it's three things: 1. Administrators' comfort level with older legacy systems. 2. Politics. 3. Past issues with Oracle Support. I see all of these issues crop up regularly. Number 1 just takes time and education. Number 3 takes time with our new, better, and growing support team; many of them came from Oracle, and there were growing pains when they went from a straight software model to having to also support hardware. Number 2 is tricky, but it's the job of the sales teams to break through the internal politics and help their clients see the value in Oracle hardware systems. Benchmarks like this will help.

    Read the article

  • Business knowledge in a large financial org?

    - by Victor
    As a programmer working in the finance industry, I recently got a project that is a hedge fund administrative application (used to calculate NAVs, allocate assets, etc.). From a business point of view, this is a good thing. When we think of our 'next' project, typically the impulse is to think in terms of technology, e.g., 'I want to work on a project that uses SOA/cloud, etc.' I am interested to know if anyone, while career planning, also takes into account the business aspect of a future project, i.e., what the application does. So does anybody ever think like this: 'I wish to work on a trading system so I can understand capital markets better,' instead of 'I want to work on a project that uses SOA/cloud, etc.'? I say this because it appears to me that in the finance domain, for senior positions, good business knowledge pays well. So maybe a guy who knows more business, but maybe not so much the latest technologies, is at an advantage? The rockstar programmer seems more suited to an aggressive startup. Big old finance orgs in particular rarely invest in tech just for the 'cool factor'. No?

    Read the article

  • Why does some software not get load balanced even when there are multiple cores?

    - by Nav
    While VTune Analyzer was running on a blade server with 8 cores, I observed the CPU usage percentage using mpstat -P ALL 1. mpstat showed me that VTune was taking up 100% of a single core, while all other cores were idle. Why does that happen? Shouldn't the OS (RHEL Server 5.2) automatically distribute the load across cores? The same happened when I tried running MATLAB (even after enabling multithreading support in the MATLAB settings). P.S.: I'm a developer, not a sysadmin, so I felt it better to ask here rather than at Server Fault.

    Read the article

  • Debian, CentOS, Slackware, FreeBSD, OpenSolaris and Ubuntu Server Edition: Which one to use for an HTTP web server?

    - by Ako
    I am going to install and administer a virtual server for a small university. The server will run inside a virtual machine (VirtualBox OSE). It is only used on the university network and is invisible to the outside world. It should run the Apache web server for PHP, MySQL, and probably a mail server. I don't know which OS to use. The main criteria for choosing include ease of administration and updating, package management, and performance. I wonder if anyone has any suggestions? The candidate OSs are: Debian, Ubuntu, CentOS, Slackware, FreeBSD, OpenSolaris. Add any other OS if you know of any better alternatives.

    Read the article

  • Best way to lay-out the website when sections of it are almost identical

    - by Linas
    So, I have a minisite for a mobile application that I made. The mobile application is a public transport (transit) schedule viewer for a particular city (let's call it Foo), and I'm trying to sell it via that minisite, which I publish at www.myawesomeapplication.com/foo/. It has the usual "standard" subpages, like "About", "Compatible phones", "Contact", etc. Now, I have decided to create analogous mobile applications for other cities, Bar and Baz. These mobile applications (products) would be almost identical to the one for the Foo city, thus the minisites for those would (should) look very similar too (except for some artwork and a Foo = Bar replacement). The question is: what do you think would be the most logical way to lay out the website in this situation, both from a business and a search engine perspective? In other words, should I just duplicate the /foo/ website to /bar/ and /baz/, or would it be better to try to create a single website under the root path (/)? I don't want search engine penalties for almost-duplicate information under /foo/, /bar/, and /baz/, and I also don't want a messy, non-localized website (I guess the user is more likely to buy something if he/she sees "This-and-that is the application for NYC, the city you live in", not "This-and-that is the application for city A, city B, ..., NYC, ..., and city Z.")

    Read the article
