Search Results

Search found 45784 results on 1832 pages for 'update site'.


  • Boost your infrastructure with Coherence into the Cloud

    - by Nino Guarnacci
    Authors: Nino Guarnacci & Francesco Scarano. The original article can be found at this URL: http://blogs.oracle.com/slc/coherence_into_the_cloud_boost. Thinking about the enterprise cloud, many possible configurations and new opportunities in enterprise environments come to mind. The customer needs that drive this trend are often very different, but they are almost always united by two main objectives: elasticity of the infrastructure, both hardware and software, with investments that follow the progressive needs of the current infrastructure, and a combination of innovation and economy. A concrete use case that I worked on recently demanded the fulfillment of exactly these two basic requirements of economy and innovation. The client needed to manage a variety of data caches that could process complex queries and parallel computational operations, while keeping the caches in a consistent state across the different server instances on which the application was installed. In addition, the customer was looking for a solution that would let him handle the load peaks that are likely during certain times of the year. For this reason, the customer required a replication site to which part of the requests could be routed during peak periods; the desire, however, was to avoid tying up investments in owned hardware and software architectures, so the request was to find a solution based on cloud technologies and architectures already offered by the market. Coherence can already address the requirement of a large cache spread across different nodes in the cluster, additionally providing technology for search and parallel computing that uses all of the hardware infrastructure resources simultaneously. Moreover, thanks to the "Push Replication" functionality, which can replicate and update the information contained in the cache even to a site hosted in the cloud, the need for a resilient infrastructure that can also rely on nodes temporarily housed in cloud architectures is satisfied. Several types of configuration can be realized using the "Push Replication" functionality of Coherence: Active-Passive (Hub and Spoke), Active-Active, Multi Master, and Centralized Replication. Since the architecture of this particular project consists of two sites (Site 1 and Site Cloud), of which only Site 1 is allowed to write into the cache, it was decided to adopt an Active-Passive (Hub and Spoke) configuration. If the requirement should change over time, it will be particularly easy to change this into an Active-Active configuration.

    Although very simple, the small sample in this post, inspired by the actual project, is effective for better understanding the features and capabilities of Coherence and its configurations. Let's create two distinct Coherence clusters, located miles apart, on two different domain contexts: one of them "hosted" at home (on-premise) and the other one hosted by any cloud provider on the network (or just on the same laptop, to test it :)). These two clusters, which we call Site 1 and Site Cloud, will contain the necessary information, and a simple client will insert data only into Site 1. On both sites a listener will be subscribed, which listens for changes to specific objects within the various caches.
    To implement these features, you need four simple classes: CachedResponse.java, the POJO class that will be inserted into the cache and that holds useful information about the hypothetical navigation links; ResponseSimulatorHelper.java, a link simulator whose task is to randomly create objects of type CachedResponse to be added to the caches; CacheCommands.java, the model of our example, responsible for receiving instructions from the controller and performing basic operations against the cache, such as inserting, deleting, updating and listening to objects within the cache; and Shell.java, our controller, which issues the commands to be executed against the caches of the two sites. So, in summary, we execute the java class "Shell", asking it to put 100 objects of type "CachedResponse" into the cache through the java class "CacheCommands"; the simulator "ResponseSimulatorHelper" will randomly create the new instances of "CachedResponse" objects. Finally, the Shell class will listen for events occurring within the cache on Site Cloud, while insertions and deletions are performed on Site 1. Now we create the configurations of the two respective sites/clusters: Site 1 and Site Cloud. For Site 1 we define a cache of type "distributed" with "read and write" features, using the cache store class for "push replication", a functionality offered by the "incubator" project of Oracle Coherence. For "Site Cloud" we likewise define a "distributed" cache type, with the TCP proxy feature enabled so that it can receive updates from Site 1. The Coherence cache config XML file for the "storage node" on "Site 1" is site1-prod-cache-config.xml, and the one for the "storage node" on "Site Cloud" is site2-prod-cache-config.xml. For the two "Shell" clients, which will connect respectively to the two clusters, we have provided two simple access configurations: the Coherence cache config XML file for the Shell on "Site 1" is site1-shell-prod-cache-config.xml, and the one for the Shell on "Site Cloud" is site2-shell-prod-cache-config.xml. Now, we just have to put everything together and run our tests.
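    The classes themselves are not reproduced in this excerpt, but a minimal sketch of what a cacheable POJO like CachedResponse.java might look like is shown below. Only the name property is implied by the article (it is queried by the "name like '%'" listener); the other fields are purely illustrative, and a real implementation might use Coherence's PortableObject instead of plain Java serialization.

    import java.io.Serializable;

    // Illustrative sketch only: the real CachedResponse in the article's download
    // may have different fields. It must be serializable to travel between cluster nodes.
    public class CachedResponse implements Serializable {

        private static final long serialVersionUID = 1L;

        private String name;          // queried by the "name like '%'" listener in TEST 2
        private String url;           // hypothetical: the navigated link
        private long   responseTime;  // hypothetical: simulated response time in ms

        public CachedResponse(String name, String url, long responseTime) {
            this.name = name;
            this.url = url;
            this.responseTime = responseTime;
        }

        public String getName()         { return name; }
        public String getUrl()          { return url; }
        public long   getResponseTime() { return responseTime; }
    }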
    To start at least one "storage" node (which holds the data) for "Site Cloud", we can run the standard class provided OOTB by Oracle Coherence, com.tangosol.net.DefaultCacheServer, with the following parameters and values: -Xmx128m -Xms64m -Dcom.sun.management.jmxremote -Dtangosol.coherence.management=all -Dtangosol.coherence.management.remote=true -Dtangosol.coherence.distributed.localstorage=true -Dtangosol.coherence.cacheconfig=config/site2-prod-cache-config.xml -Dtangosol.coherence.clusterport=9002 -Dtangosol.coherence.site=SiteCloud. To start at least one "storage" node for "Site 1", we run the same standard Coherence class, com.tangosol.net.DefaultCacheServer, with the following parameters and values: -Xmx128m -Xms64m -Dcom.sun.management.jmxremote -Dtangosol.coherence.management=all -Dtangosol.coherence.management.remote=true -Dtangosol.coherence.distributed.localstorage=true -Dtangosol.coherence.cacheconfig=config/site1-prod-cache-config.xml -Dtangosol.coherence.clusterport=9001 -Dtangosol.coherence.site=Site1. Then we start the first "Shell" client, for "Site Cloud", launching the java class it.javac.Shell with these parameters and values: -Xmx64m -Xms64m -Dcom.sun.management.jmxremote -Dtangosol.coherence.management=all -Dtangosol.coherence.management.remote=true -Dtangosol.coherence.distributed.localstorage=false -Dtangosol.coherence.cacheconfig=config/site2-shell-prod-cache-config.xml -Dtangosol.coherence.clusterport=9002 -Dtangosol.coherence.site=SiteCloud. Finally, we start the second "Shell" client, for "Site 1", launching a new instance of the class it.javac.Shell with the following parameters and values: -Xmx64m -Xms64m -Dcom.sun.management.jmxremote -Dtangosol.coherence.management=all -Dtangosol.coherence.management.remote=true -Dtangosol.coherence.distributed.localstorage=false -Dtangosol.coherence.cacheconfig=config/site1-shell-prod-cache-config.xml -Dtangosol.coherence.clusterport=9001 -Dtangosol.coherence.site=Site1. And now, let's execute some tests to validate and better understand our configuration. TEST 1: the purpose of this test is to load objects into the "Site 1" cache and see how many objects end up cached on "Site Cloud". Within the "Shell" launched with parameters to access "Site 1", write and run the command: load test/100. Within the "Shell" launched with parameters to access "Site Cloud", write and run the command: size passive-cache. Expected result: if all is OK, the first "Shell" has loaded 100 objects into a cache named "test"; consequently the "push replication" functionality has updated "Site Cloud" by sending the 100 objects to the second cluster, where they will have been stored in a corresponding cache, which we named "passive-cache".
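    For illustration only, here is a minimal sketch of what the "load test/100" command in TEST 1 amounts to. It is not the article's actual Shell/CacheCommands code (that ships with the download); it assumes the standard Coherence NamedCache API, and the class name, keys and field values are illustrative.

    import com.tangosol.net.CacheFactory;
    import com.tangosol.net.NamedCache;

    // Sketch: put 100 CachedResponse objects into the cache named "test";
    // push replication then forwards them to the "passive-cache" on Site Cloud.
    public class LoadExample {
        public static void main(String[] args) {
            NamedCache cache = CacheFactory.getCache("test");
            for (int i = 0; i < 100; i++) {
                // in the real example, ResponseSimulatorHelper creates these randomly
                CachedResponse response = new CachedResponse("response-" + i,
                        "http://example.com/page/" + i, (long) (Math.random() * 500));
                cache.put(Integer.valueOf(i), response);
            }
            System.out.println("Cache size: " + cache.size());
            CacheFactory.shutdown();
        }
    }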
    TEST 2: the purpose of this test is to listen for the delete and add events that happen on "Site 1" and are replicated to the cache on "Site Cloud". In the "Shell" launched with parameters to access "Site Cloud", write and run the command: listen passive-cache/name like '%' (or a "cohql" query with your preferred parameters; a sketch of such a listener follows below). In the "Shell" launched with parameters to access "Site 1", write and run the following commands: load test/10, load test2/20, delete test/50. Expected result: if all is OK, the "Shell" on Site Cloud lets us listen to all the add and delete events within the cache "passive-cache" whose objects satisfy the query condition "name like '%'" (that is, every object in the cache; you could change the tests and create different queries). Through the Shell on "Site 1" we launched the commands to add and delete objects in different caches (test and test2). With the "Shell" running on "Site Cloud" we got the evidence (displayed, printed, or in a log file) that its cache has been filled with the events and related objects generated by the commands executed from the "Shell" on "Site 1", thanks to the "push replication" feature. Other tests can be performed as well, such as subscribing to the events on "Site 1" too, using different "cohql" queries, or changing the cache configuration, to effectively demonstrate both the potential and the versatility provided by these different configurations, even in the cloud, as in our case. More information on how to configure Coherence "Push Replication" can be found in the Oracle Coherence Incubator project documentation at the following link: http://coherence.oracle.com/display/INC10/Home. More information on the Oracle Coherence "In Memory Data Grid" can be found at the following link: http://www.oracle.com/technetwork/middleware/coherence/overview/index.html. To download the complete sources and configurations of the example explained in this post, click here to download them; then download the latest available version of the Push Replication Pattern library implementation from the Oracle Coherence Incubator site, and also download the related and required version of Oracle Coherence. For simplicity, the .jar files required to execute the example (which can be found in the Push Replication Pattern download and the Coherence distribution download) are: activemq-core-5.3.1.jar, activemq-protobuf-1.0.jar, aopalliance-1.0.jar, coherence-commandpattern-2.8.4.32329.jar, coherence-common-2.2.0.32329.jar, coherence-eventdistributionpattern-1.2.0.32329.jar, coherence-functorpattern-1.5.4.32329.jar, coherence-messagingpattern-2.8.4.32329.jar, coherence-processingpattern-1.4.4.32329.jar, coherence-pushreplicationpattern-4.0.4.32329.jar, coherence-rest.jar, coherence.jar, commons-logging-1.1.jar, commons-logging-api-1.1.jar, commons-net-2.0.jar, geronimo-j2ee-management_1.0_spec-1.0.jar, geronimo-jms_1.1_spec-1.1.1.jar, http.jar, jackson-all-1.8.1.jar, je.jar, jersey-core-1.8.jar, jersey-json-1.8.jar, jersey-server-1.8.jar, jl1.0.jar, kahadb-5.3.1.jar, miglayout-3.6.3.jar, org.osgi.core-4.1.0.jar, spring-beans-2.5.6.jar, spring-context-2.5.6.jar, spring-core-2.5.6.jar, spring-osgi-core-1.2.1.jar, spring-osgi-io-1.2.1.jar. The original article can be found at this URL: http://blogs.oracle.com/slc/coherence_into_the_cloud_boost. Authors: Nino Guarnacci & Francesco Scarano.
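    Returning to TEST 2 above, a minimal sketch of the kind of filtered listener the "listen passive-cache/name like '%'" command could register is shown here. It is not the article's actual Shell implementation; it assumes the standard Coherence observable-map API (LikeFilter, MapEventFilter, addMapListener), and the class name is illustrative.

    import com.tangosol.net.CacheFactory;
    import com.tangosol.net.NamedCache;
    import com.tangosol.util.MapEvent;
    import com.tangosol.util.MapListener;
    import com.tangosol.util.filter.LikeFilter;
    import com.tangosol.util.filter.MapEventFilter;

    // Sketch: listen for insert/update/delete events on the replicated cache,
    // restricted to entries whose getName() matches the LIKE pattern '%'.
    public class ListenExample {
        public static void main(String[] args) throws InterruptedException {
            NamedCache cache = CacheFactory.getCache("passive-cache");

            MapEventFilter filter = new MapEventFilter(
                    MapEventFilter.E_ALL, new LikeFilter("getName", "%"));

            cache.addMapListener(new MapListener() {
                public void entryInserted(MapEvent evt) { System.out.println("inserted: " + evt.getNewValue()); }
                public void entryUpdated (MapEvent evt) { System.out.println("updated:  " + evt.getNewValue()); }
                public void entryDeleted (MapEvent evt) { System.out.println("deleted:  " + evt.getOldValue()); }
            }, filter, false);

            Thread.sleep(Long.MAX_VALUE);  // keep the client alive to receive events
        }
    }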

    Read the article

  • Site Review: Facebook.com and Blockbuster.com - Navigation Schemes

    After cycling through a list of my favorite sites, I decided to select Facebook.com and Blockbuster.com for this post because I found their navigation schemes very intuitive. Facebook, in my opinion, took a very simplistic and minimalistic approach when they designed their site and its navigation. For example, when you log in to your account you will find on the upper left-hand side a generic section of the site with areas common to all users, like news, messages, events, photos and friends. Below this, in a separate navigation menu, is a list of applications that a user has elected to access through bookmarks. Finally, the upper right-hand corner of the site contains links to administer the user's account, like account settings, the public profile, and a link back to the user's home page. Blockbuster, on the other hand, tried to make site navigation a little more slick by using a menu-submenu approach, where users can click on items like Rent, Buy, On Demand, Games, Stores, and Gifts, and a submenu of corresponding items appears below the original menu item. In addition, they added categorized lists of the movies they offer on the homepage, so that users can click on an item like "DVD Spotlight" and a list of movies represented as actual DVD box cases appears on the screen, which users can scroll through using the left and right arrows on either side of the displayed images. Both Facebook and Blockbuster have more than one navigation grouping because their respective sites are so large and offer an enormous number of features. For this reason they have to group the main functionality and information into logical groups based on the actions performed and the specific information accessed. For example, it would not make sense for Facebook to place a particular game you like to play within a section pertaining to account administration. The game link would be completely out of place and would really confuse the user's experience because the items would not be logically grouped. In addition, I think Facebook users would benefit if Facebook allowed its users to specify what they want on the general navigation from within the site, or at least created a section showing frequently accessed pages or favorite sections. Finally, regarding additional navigation, I think Blockbuster users really benefit from the submenu system of categorizing data; in fact, Blockbuster even allows them to refine the information they are looking for through secondary submenu systems, letting users really drill down into what they want to learn more about. I do not think that having more than one navigation bar on a web page is confusing for the user. For example, having a navigation bar at the top of your page and one at the bottom allows users to move around the website more easily, because they can use the navigation closest to where their cursor is on the page. As for designers forcing all the navigation into one navigation bar, I think it would be hard for the user to fully understand what is going on, given the size and complexity of the sites they are dealing with. For example, Blockbuster has a ton of content that could not easily be put into one navigation bar. From my experience with both Facebook and Blockbuster, they both do a good job with cross-browser compatibility. I have had no issues with either site in IE, Firefox, Chrome, or Safari over the years.
    In addition, I do not believe that either Facebook or Blockbuster requires any additional plug-ins to use their navigation bars.

    Read the article

  • GA UA codes for testing site - set up

    - by Drew
    Does anyone know the process for using a live GA UA code to test a site in development? That is, I have a live site with a GA UA code attached, tracking live traffic data, e.g. UA-123456. I've been told that there is a way to produce another code associated with the primary code to use on the testing version of my live site, e.g. the test code could be UA-123457. Can anyone shed some light on this? If it's not possible, should I just set up a completely separate GA account for my testing site?

    Read the article

  • Set up development site on another server/host

    - by Ofeargall
    I'm developing a site for a client. They've got a site now that's hosted at hosting.com. I'm going to move them to my VM hosting solution at Edgeweb, but I want to run some tests and have the client approve the site before changing the name servers to the new site/hosting location. How do I make this happen? I'm running Red Hat Linux with Apache for the Edgeweb hosting. I don't have control of the domain name (i.e. the client controls that right now). Edgeweb has set up a DNS zone for the domain name so that when the time comes to switch, we're ready to go. I'm a web developer and I understand the technologies that make a user experience 'work', but I'm unfamiliar with the server jargon and all that, so please be patient. Thanks in advance.

    Read the article

  • New A-Team Web Site Launched

    - by .raja
    The A-Team has launched a new web site – A-Team Chronicles – which aggregates and organizes content produced by A-Team members (including your humble blogger). The A-Team is a central, outbound, highly technical team of Enterprise Architects, Solution Specialists and Software Engineers within the Fusion Middleware Product Development Organization that works with customers and partners worldwide, providing guidance on implementation best practices, architecture, troubleshooting, and how best to use Oracle products to solve customer business needs. This content captures the best practices, tips and tricks, and guidance that A-Team members gain from real-world experience working with customers and partners on implementation projects, through architecture reviews, issue resolution, and more. A-Team Chronicles makes this content available, through short and to-the-point articles, to all our customers and partners in a consistent, easy-to-find, and organized way. If you like the articles we post here, you might find even more interesting articles at the new A-Team Chronicles site, covering a wider range of Fusion Middleware topics. We will be decommissioning this site shortly in favor of the A-Team Chronicles site, and all new content will be posted there.

    Read the article

  • SEO for site with 301 redirect on root domain to subfolder

    - by Kim
    I've been asked to do SEO for a site. The site is built with WordPress and PrestaShop, and because of this the root domain has a 301 redirect to a subfolder - domain/shop/. For my SEO submission work, I know it's not good practice to submit URLs that have redirects on them, and a lot of the time it's not allowed. After searching the net, I think my best bet is to do all my site submissions using the URL domain/shop/, even though it will take a lot more listings to move them up in the rankings compared to using the root domain. I'm not sure how it will work: the root domain normally carries the greatest rank and then passes rank to the rest of the site, so if I'm targeting the subfolder, will it work?

    Read the article

  • Strange Google Analytics result when new site launched

    - by Howard
    I have a web site which mainly contained a few pages, and we have now revamped it into a new site which contains several hundred pages. We see a strange Google Analytics result, as follows. Before: Traffic sources (all traffic): 674; Content (all pages, unique PV): 674. After: Traffic sources (all traffic): 291; Content (all pages, unique PV): 1235. As you can see, the unique PV count has increased as expected (as we have more pages now and the site is better), but why is the traffic sources figure lower, and why is there such a large gap? Any idea?

    Read the article

  • Script to determine the SSL certificate assigned to each site

    - by Thomas
    I have an IIS6 web server with 100+ sites on it. Recently, I was forced to renew the wildcard SSL certificate that all the sites use by creating a new CSR rather than a renewal CSR. I have installed the certificate and can update each site one at a time to use the new certificate; however, I was wondering whether there is a way to update every site at the same time, and whether there is a script I can use to view which certificate is currently being used by each site.

    Read the article

  • Subscription-based eCommerce Site selling Physical Goods

    - by Kash
    I want to implement a system where customers can buy physical products as a subscription with recurring billing. Customers will come to my site, visit a product page and buy stuff. They need to register an account before purchasing. After registration they will be forwarded to the payment processor for payment, and after successful payment they will be returned to my site. Requirements: I don't want to use PayPal as the payment processor. It will be a recurring payment on a monthly, quarterly, bi-annual or annual basis. Customers can log in to my site to view their order history, manage their profile, and edit their billing and/or shipping details. Customers can cancel their subscription at any time after logging in to their account on my site. Plan: Currently I am using WordPress with the WooCommerce plugin for eCommerce functionality. I am planning to use Amazon as the payment processor. I have searched the WooCommerce extension directory and found the WC Subscriptions extension that could be used for my site. But I just confirmed with support that WC Subscriptions only supports recurring payments with PayPal and Stripe; no other payment processor is supported at the moment. Questions: Given my needs, what would be the best solution? I am open to switching the platform from WordPress to something else that can fulfill my requirements.

    Read the article

  • Data transfer between "main" site and secured virtual subsite

    - by Emma Burrows
    I am currently working on a C# ASP.Net 3.5 website I wrote some years ago which consists of a "main" public site, and a sub-site which is our customer management application, using forms-based authentication. The sub-site is set up as a virtual folder in IIS and though it's a subfolder of "main", it functions as a separate web app which handles CRUD access to our customer database and is only accessible by our staff. The main site currently includes a form for new leads to fill in, which generates an email to our sales staff so they can contact them and convince them to become customers. If that process is successful, the staff manually enter the information from the email into the database. Not surprisingly, I now have a new requirement to feed the data from the new lead form directly into the database so staff can just check a box for instance to turn the lead into a customer. My question therefore is how to go about doing this? Possible options I've thought of: Move the new lead form into the customer database subsite (with authentication turned off). Add database handling code to the main site. (No, not seriously considering this duplication of effort! :) Design some mechanism (via REST?) so a webpage outside the customer database subsite can feed data into the customer database How to organise the code for this situation, preferably with extensibility in mind, and particularly are there any options I haven't thought of?

    Read the article

  • Google I/O 2010 - Bringing Google to your site

    Google I/O 2010 - Bringing Google to your site (Google APIs 101), DeWitt Clinton and Jeff Scudder. This is an overview session about some of the many ways that a developer can enrich their site and more fully engage their visitors using Google products. We will cover a variety of products and APIs designed to quickly and easily improve and monetize your site, from AdSense and Custom Search to Feeds and Web Elements. We'll include announcements for several eye-popping new features. For all I/O 2010 sessions, please go to code.google.com. From: GoogleDevelopers. Time: 57:26.

    Read the article

  • Site Web Analytics not updating Sharepoint 2010

    - by Rohit Gupta
    If you are facing the issue that the Web Analytics reports in SharePoint 2010 Central Administration are not updating data: when you go to your site > Site Settings > Site Web Analytics reports or Site Collection Analytics reports, you get old data, with the ribbon displaying "Data Last Updated: 12/13/2010 2:00:20 AM". Please ensure that the following things are covered: ensure that the Usage and Health Data Collection service is configured correctly; the log collection schedule is configured correctly; the Microsoft SharePoint Foundation Usage Data Import and Microsoft SharePoint Foundation Usage Data Processing timer jobs are configured to run at regular intervals. One last important timer job is the Web Analytics Trigger Workflows timer job; ensure that this timer job is enabled and scheduled to run at regular intervals (for each site that you need analytics for). After you have ensured that the web analytics service configuration is working fine, that the Usage Data Import job is importing the *.usage files from the ULS logs folder into the WSS_Logging database, and that all the required timer jobs are running as expected, wait for a day for the report to get updated. The report gets updated automatically at 2:00 AM, and I could not find a way to control the schedule for this report update job. So be sure to wait for a day before giving up :)

    Read the article

  • Can't install or update Ubuntu after using parameter acpi_osi = Linux

    - by Lucas Leitão
    I recently had an issue with my Acer 4736z notebook: I was getting a blank screen after booting the OS, and someone told me to use the parameter GRUB_CMDLINE_LINUX_DEFAULT = "quiet splash acpi_osi = Linux" (adding acpi_osi = Linux after quiet splash) inside the grub configuration. It worked for me, but since then I can't install or update anything on Linux, because it says: Removing linux-image-extra-3.5.0-17-generic ... Examining /etc/kernel/postrm.d . run-parts: executing /etc/kernel/postrm.d/initramfs-tools 3.5.0-17-generic /boot/vmlinuz-3.5.0-17-generic update-initramfs: Deleting /boot/initrd.img-3.5.0-17-generic run-parts: executing /etc/kernel/postrm.d/zz-update-grub 3.5.0-17-generic /boot/vmlinuz-3.5.0-17-generic /usr/sbin/grub-mkconfig: 11: /etc/default/grub: GRUB_CMDLINE_LINUX_DEFAULT: not found run-parts: /etc/kernel/postrm.d/zz-update-grub exited with return code 127 Failed to process /etc/kernel/postrm.d at /var/lib/dpkg/info/linux-image-extra-3.5.0-17-generic.postrm line 328. dpkg: error processing linux-image-extra-3.5.0-17-generic (--remove): subprocess installed post-removal script returned error exit status 1 Removing linux-image-3.5.0-17-generic ... Examining /etc/kernel/postrm.d . run-parts: executing /etc/kernel/postrm.d/initramfs-tools 3.5.0-17-generic /boot/vmlinuz-3.5.0-17-generic update-initramfs: Deleting /boot/initrd.img-3.5.0-17-generic run-parts: executing /etc/kernel/postrm.d/zz-update-grub 3.5.0-17-generic /boot/vmlinuz-3.5.0-17-generic /usr/sbin/grub-mkconfig: 11: /etc/default/grub: GRUB_CMDLINE_LINUX_DEFAULT: not found run-parts: /etc/kernel/postrm.d/zz-update-grub exited with return code 127 Failed to process /etc/kernel/postrm.d at /var/lib/dpkg/info/linux-image-3.5.0-17-generic.postrm line 328. dpkg: error processing linux-image-3.5.0-17-generic (--remove): subprocess installed post-removal script returned error exit status 1 Errors were encountered while processing: linux-image-extra-3.5.0-17-generic linux-image-3.5.0-17-generic E: Sub-process /usr/bin/dpkg returned an error code (1) I already tried to remove older kernels, but it gives me the same message. Do you have a clue about what I should do?

    Read the article

  • Why my site is not ranking for particular keyword

    - by user543087
    My site is only 3 days short of being 6 months old. The website is unique; that is, there is no competitor to this type of site in India: it provides a comparison of payment gateways in India, apart from the payment gateway companies themselves. I've optimized it for the keyword "payment gateway". I've changed the URLs twice, the latest being 3 months back, in which case Google Webmaster Tools reported plenty of 404s. I corrected the useful 404s and left the meaningless ones as they are. What is the reason it's not ranking well for "payment gateways"? Even sites with a single page about payment gateways seem to be ranking better than this. Is it due to: 1) the many outbound links to in-context companies and information, or 2) the 404s reported in Google Webmaster Tools? Another site of mine successfully gets 1,500 unique visitors daily and is up in Google's rankings. I don't know why this one is not!

    Read the article

  • Is it possible to sell a web site?

    - by Bogdan0x400
    There might be a situation where one of my clients won't pay for the web site that I've made. So I am wondering if it is possible to sell a web site? It is an internet shop, so there is no content that comes with it, but the source code is fully available, and it has a decent design. I've seen people trying to sell web site templates, and I've seen people who try to sell already running web sites, and there are plenty of commercial web site engines out there. But what about raw empty web sites, is there a market for them?

    Read the article

  • Good design for a simple site that contains a blog

    - by bporter
    What is a good design for a simple web site with mostly static pages and a blog? I am helping a friend build this for their small business. We are looking for a simple approach that can be implemented fairly quickly. (I am a programmer and can help with coding, hosting, etc.) One option is to use a site like virb, which lets you choose from one of their themes and build a site pretty easily. You can also include a blog. They host the site for a pretty low monthly rate. I recommended this option, but my friend wants a design that is unique and custom. So, I took one of the themes and started modifying the HTML and CSS. This might still be a good option, but if we are going to greatly modify it, why not just create the static pages from scratch and use something like WordPress for the blog? Is this a good option? It looks fairly easy to integrate WordPress with a site so that the design and behavior are really cohesive. Is this a good idea? Do you recommend any other approaches?

    Read the article

  • How can one keep an ecommerce site active?

    - by Mantorok
    So, you build an e-commerce site and all your products are on there, but then very little changes, which obviously causes your site to become less active and ultimately to rank less highly in search engines. Is there anything that can be done to keep it active? I'm aware that inbound links are important, and I guess these come over time; are there any other recommended means of keeping the site active?

    Read the article

  • Services - Separate Sites or One Site - Impact on SEO

    - by Lynda
    I have a client who is a lawyer specializing in criminal defense and DUI; however, he does not show up well in Google. In researching this, I found that the sites that rank better have much more content for those specialties than his site does, and my thought is that he needs to add more quality content to rank better for those searches. On his site he mentions his specialties, but he also has various personal things that reflect his interests. These are clearly separated from the business portion. My questions are: 1) should he separate his personal information into a new domain, and 2) should he have a separate URL for each of his specialties? Or would one URL work as long as everything is clearly separated? I read once that for legal services to rank well you should make a separate site for each specialty and have that site focus solely on that service.

    Read the article

  • Managing products on an ecommerce site [closed]

    - by John
    I've had a site that sells widgets for many years. I do not inventory my widgets, but the cost of adding them to the site and making sure the site is current is becoming cost prohibitive. Here are the facts: I sell a single class of widget. I have about 50,000 widgets on my site. I have about 100 vendors that create and drop-ship the products when they get an order from me via email. Each vendor carries from 50 to 5,000 types of widgets. Vendors all have websites with images and descriptions of their products. Each widget is produced in limited supply and usually sells out in 1-5 years. Prices of the widgets often go up, sometimes by more than 50%, before they sell out. My vendors aren't very tech sophisticated. They have websites with their products, but most can't supply an API or a database dump. Their websites usually display retail prices to the public, but I log in or refer to a price list (usually Excel) for wholesale prices. As it stands now, I hire local people to add and describe each widget on our website. It usually takes a person 4 minutes to add a widget to the site. This doesn't include moving to a new vendor. I feel like the upload/edit process is as good as it can get via a form/website. The problem is that it is getting very expensive to upload and keep the widget inventory current. I often get orders for something after it's sold out at the vendor, or the price is wrong. This seems like it would be a problem in many industries. Can anyone suggest the cheapest way to upload inventory and ensure prices are current from my vendors? I'm assuming it will involve outsourcing, but I would like ideas on how to set up the compensation model.

    Read the article

  • The JavaFX Community Site on Java.net

    - by Tori Wieldt
    Community activity surrounding JavaFX has been steadily growing, with tweets, blog posts, and projects increasing in number. We are pleased to announce that there is now a JavaFX community site on Java.net at the following URL: javafxcommunity.com  This site is an aggregator of JavaFX information, where you can find links to JavaFX blog posts, tweets, and other resources.  Gerrit Grunwald and Jim Weaver are the community leaders for this site, and they welcome your feedback on how to make the JavaFX Community site more useful to you! Learn more on Jim Weaver’s Rich-Client Java Blog. 

    Read the article

  • Does redirecting old site's URLs to new site's front page hurt a page's ranking?

    - by Kaivosukeltaja
    An old site that is being rewritten needs to have its URLs redirected to the new site. There are a few hundred pages that may or may not have corresponding pages on the new site, probably with different slugs, and adding mappings manually would require more hours than we can spare. It was suggested that all old URLs be redirected to the new front page, but I remember reading somewhere that this incurs a penalty in page rank because it's what link farmers do. Is this true, or can we take the easy way out?

    Read the article

  • Which is the best Question and Answer site?

    - by Geek
    Hi, which site do you think is the best question/answer site for a developer's technical questions? Good old stackoverflow seems to be losing its charm; if you go through it you will find loads and loads of questions with very few answers. In fact, most questions don't even have enough reads on them. Has any new site come up which has gained recent popularity for technical questions? Thanks.

    Read the article

  • Working out costs to implement WCAG 2.0 (AA) site

    - by Sixfoot Studio
    Hi, I've run our client's site through a WCAG 2.0 validator, which returned 415 tasks that need to be worked through in order to make it WCAG 2.0 compliant. For the most part, I can get a rough estimate of how long a task will take, but there are tasks I have never had to do before which I am not sure how to cost. I would like to know if someone has a rough guide on what to charge a client to convert their site to a compliant WCAG 2.0 (AA) site. Many thanks

    Read the article

  • Redesigning an old site, structure change, etc.

    - by RhymeGuy
    I have an old site built in 2006; it has around 200 pages and 500 pictures. Every single page is of course indexed, as well as the images. It is very well ranked for its targeted keywords and I receive a good amount of SEO traffic (I guess that's due to the various campaigns, branding, PPC, etc.). Problem: the site has an outdated design, the pages and images do not have proper names, there are no heading or alt tags, and it was built with tables and inline CSS, etc. Goal: completely redesign the site, use divs, change file names, add proper meta data, alt tags, etc. Question: how can this affect the current SEO positions? I will redirect (301) every single page to its new counterpart and build a site map, but what should I do with the images? Do I need to redirect them also? Any other suggestions?

    Read the article
