Search Results

Search found 4786 results on 192 pages for 'traffic shaping'.

Page 143 of 192

  • Shared to dedicated or Amazon CloudFront to improve performance and stay secure?

    - by user978548
    I have a WordPress site whose home page currently takes about 1.8s to 2.5s to load completely in my country. The page weight is about 700 KB (static content included). To improve performance, I'm considering two solutions: switching to a dedicated host, or using Amazon S3 plus CloudFront to serve static content. My current shared hosting has servers in a neighboring country but not in mine, and both Amazon and the dedicated host have servers there, so that's already an advantage. Considering all that, I still have three questions:
    1. With low traffic (100 unique visitors/day, but growing), will there be a huge difference between my shared hosting and a dedicated server?
    2. Given that I already use a cookie-less domain to deliver static content (but via a redirection to the same server), would using Amazon S3 make a real difference?
    3. On the cons of dedicated vs. Amazon S3: if I choose for the dedicated server something like Ubuntu Server, do daily package updates, and keep only port 80 open, would that be sufficient in terms of security (compared with my current shared hosting, which manages everything for me)?
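
    On the security question, a minimal sketch of the "only port 80 open" setup, assuming Ubuntu's ufw firewall frontend (an assumption; the post doesn't name a tool). Note that an SSH port has to stay reachable for administration, or you lock yourself out:

        # Assumes Ubuntu Server with ufw installed (the default on recent releases).
        sudo ufw default deny incoming   # drop all inbound by default
        sudo ufw allow 80/tcp            # web traffic
        sudo ufw allow 22/tcp            # SSH for administration; restrict by source IP if possible
        sudo ufw enable

    Daily package updates plus a minimal open-port set is a reasonable baseline, but it does shift patching and monitoring duties from the shared host onto you.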

    Read the article

  • Antenna Aligner Part 3: Kaspersky

    - by Chris George
    Quick one today. Since starting this project, I've been encountering times when Nomad fails to build my app; it would then take repeated attempts before a build went through successfully. Rob, who works on Nomad at Red Gate, investigated this and found that certain parts of the message required to trigger the 'cloud build' were not getting through to the Nomad app, causing the HTTP connection to stall until timeout. After much head-scratching, it turned out that the Kaspersky Internet Security system I have installed on my laptop at home was being very aggressive and was causing the problem. Perhaps it's trying to protect me from myself? Anyway, we came up with an interim solution while the Nomad guys investigate with Kaspersky: setting Visual Studio to be a trusted application in the Kaspersky settings and configuring it not to scan network traffic. Hey presto! This worked and I have not had a single build problem since (other than losing internet connection, or that embarrassing moment when you blame everyone else and then realise you've accidentally switched off the wireless on your laptop).

    Read the article

  • Updating an ADF Web Service Data Control When Service Structure or Location Change

    - by Shay Shmeltzer
    The web service data control in Oracle ADF gives you a simplified approach to consuming services in ADF applications, and now with ADF Mobile the usage of this control seems to be growing. A frequent question we get is: what happens if the service I'm consuming changes - how do I update my data control? First we should mention that if you design your application well before you actually code, then things like the Web service method signature shouldn't change. The signature is the contract between the publisher and the consumer, and contracts shouldn't be broken. But in reality things do change during development, so here is how you can update both method signatures and the service location with the Web service data control. After watching this video you might be tempted not to copy the WSDLs to your project - which lets you use the right-click update on a data control. However, there is a reason the copy is on by default: it reduces network traffic when you are actually running your application, since ADF doesn't need to go to the server to find out the service structure. So for runtime performance, you should probably keep the WSDL local. I encourage you to look further into both the connections.xml file, where your service location is saved, and the datacontrols.dcx file, where its definition is kept, to get an even deeper understanding of how ADF works underneath the declarative layers.

    Read the article

  • Mobile cross-platform SDK for computationally intensive apps

    - by K.Steff
    I am aware of the PhoneGap toolkit for creating mobile applications for virtually all mobile platforms with a significant market share. However, the code in PhoneGap that is shared between the different platforms is written in JavaScript. While I like JS, I think it's hardly appropriate for computationally intensive tasks. The situation with Titanium is pretty much the same. So, is there any way I can create a cross-platform mobile app that shares the computationally intensive code between platforms? Some context: obviously, I don't want to implement the time-consuming algorithm in many different languages, since doing so violates DRY, increases the chance of bugs slipping into at least one version, and requires boilerplate code to work. I've looked at Xamarin's MonoTouch and Mono for Android tools, but while they cover iOS and Android, they're not nearly as versatile for deployment as PhoneGap. On the other hand, (IMO) the statically typed nature of C# is better suited to intense computation than JS. Are there any other SDKs/tools appropriate for the task that I don't know about, or a point about the ones mentioned above that I've missed? Also, uploading data to a web service for processing is not an option, because of the traffic required.

    Read the article

  • Get to Know a Candidate (6 of 25): Jill Stein - Green Party

    - by Brian Lanham
    DISCLAIMER: This is not a post about "Romney" or "Obama". This is not a post about whom I am voting for. Information sourced from Wikipedia. Stein is a physician with degrees from Harvard College and Harvard Medical School. She serves on the boards of Greater Boston Physicians for Social Responsibility and MassVoters for Fair Elections, and has been active with the Massachusetts Coalition for Healthy Communities. Jill Stein advocates a "Green New Deal" in which renewable energy jobs would be created to address climate change and environmental issues, with the objective of employing "every American willing and able to work". Citing the research of Dr. Phillip Harvey, Professor of Law & Economics at Rutgers University, as evidence of the successful economic effects of the 1930s' New Deal projects, Stein would fund the plan with a 30% reduction in the U.S. military budget, returning US troops home, and increasing taxes on areas such as capital gains, offshore tax havens, and multimillion-dollar real estate. Stein plans to address what she sees as a growing convergence of environmental crises in water, soil, fisheries, and forests through the creation of sustainable infrastructure based on clean renewable energy generation and sustainable-communities principles, such as increasing intra-city mass transit and inter-city railroads, creating 'complete streets' that safely encourage bike and pedestrian traffic, and building regional food systems based on sustainable organic agriculture. The Green Party of the United States was founded in 1991 as a voluntary association of state green parties. With its founding, the Green Party of the United States became the primary national Green organization in the United States, eclipsing the Greens/Green Party USA, which emphasized non-electoral movement building. The Green Party of the United States emphasizes environmentalism, non-hierarchical participatory democracy, social justice, respect for diversity, peace, and nonviolence. Their "Ten Key Values," which are described as non-authoritative guiding principles, are as follows:
    1. Grassroots democracy
    2. Social justice and equal opportunity
    3. Ecological wisdom
    4. Nonviolence
    5. Decentralization
    6. Community-based economics
    7. Feminism and gender equality
    8. Respect for diversity
    9. Personal and global responsibility
    10. Future focus and sustainability
    The Green Party does not accept donations from corporations. Thus, the party's platforms and rhetoric critique any corporate influence and control over government, media, and American society at large. Stein has access to 403 electoral votes and is a write-in candidate in GA, IN, and MS. Learn more about Jill Stein and the Green Party on Wikipedia.

    Read the article

  • First Experience with Web Services

    When I first started programming with Microsoft .NET (1.0 Framework) I had a strong desire to learn how search engines indexed web sites. At that time I was working as a search engine spammer, creating web pages to generate traffic for specific themes for various clients. One way I attempted to better understand .NET was to build a web spider that analyzed web pages on demand. An example of the spider is hosted at AddLinkz.com. After my spider was built I had no real idea what I could or should do with it until I found the MSN Search API. I used this web service to compare its results with my spider's. Additionally, I used the API to feed my .NET web spider new URLs based on specific search terms. MSN's search API was very easy to use: I just had to request information by calling a web URL with parameters via a GET request, and the results were returned in XML. At that time all requests were limited to XML responses and a maximum of 1,000 results per query. Since then the entire API has gone through several reconstructions, rebrandings, and new search services. Microsoft's new Bing API replaced the older MSN Search API and added several new search capabilities. These new features allow search data to be returned for web searches, image searches, news searches, and related search terms, to name a few.
    Bing API Version 2.0 SourceTypes:
    - Web: searches for web content (example query: "sushi")
    - Image: searches for images on the web ("sushi")
    - News: searches news stories ("sushi")
    - InstantAnswer: searches Encarta online ("what is sushi", "convert 5 feet to meters", "x*5=7", "2 plus 2")
    - Spell: searches the Encarta dictionary for spelling suggestions
    - Phonebook: searches phonebook entries ("sushi in Los Angeles")
    - RelatedSearch: returns the query strings most similar to yours
    - Ad: returns advertisements to incorporate with results
    I currently plan to start using the web search feature from the new Bing 2.0 API in an open source project related to exception management. Currently, it is still in the conception phase.

    Read the article

  • Which is the best image hosting site for hosting a website's images? [closed]

    - by rahul dagli
    Possible Duplicate: How to find web hosting that meets my requirements? I currently have a website and blog on a limited web hosting plan. When I upload images to my hosting server they consume a lot of bandwidth and space, so I was thinking of hosting the images on some other image hosting site and hotlinking them to my site. I found a few sites like ImageShack, Photobucket, TinyPic, and Imgur; however, I see they all have certain restrictions. The features I am looking for are as follows:
    1. At least 10 GB space
    2. At least 500 GB bandwidth (because I have very high traffic)
    3. Very high speed even under heavy load, like 1,000 visitors accessing every hour
    4. Ultra reliable servers (99.9% uptime)
    5. Privacy control
    6. Must never delete an image if it is inactive
    7. Create and manage albums
    8. A company that will last in business for at least the next 10 years
    9. Free of cost
    10. Hotlinking/direct linking of images

    Read the article

  • Trigger an IP ban based on requests for a given file?

    - by Mike Atlas
    I run a website where "x.php" was known to have vulnerabilities. The vulnerability has been fixed and I don't have "x.php" on my site anymore. As happens with major public vulnerabilities, script kiddies are running tools that hit my site looking for "x.php" throughout the entire structure of the site - constantly, 24/7. This is wasted bandwidth, traffic, and load that I don't really need. Is there a way to trigger a time-based (or permanent) ban for an IP address that tries to access "x.php" anywhere on my site? Perhaps I need a custom 404 PHP page that captures the fact that the request was for "x.php" and then triggers the ban? How can I do that? Thanks! EDIT: I should add that as part of hardening my site, I've started using ZBBlock: "This php security script is designed to detect certain behaviors detrimental to websites, or known bad addresses attempting to access your site. It then will send the bad robot (usually) or hacker an authentic 403 FORBIDDEN page with a description of what the problem was. If the attacker persists, then they will be served up a permanently recurring 503 OVERLOAD message with a 24 hour timeout." But ZBBlock doesn't do quite exactly what I want to do; it does help with other spam/script/hack blocking.
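
    One cheap way to cut the wasted load, sketched here as an Apache-only answer rather than a full ban system (assumes mod_rewrite is available):

        # .htaccess sketch: answer any request for x.php with "410 Gone"
        # before PHP is ever loaded; [NC] matches case-insensitively.
        RewriteEngine On
        RewriteRule (^|/)x\.php$ - [G,NC,L]

    A true time-based ban needs state shared across requests, which is why log-watching tools such as fail2ban (which scans the access log and inserts firewall rules for offending IPs) are the usual next step.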

    Read the article

  • How can I pass the referrer header from my https domain to http domains?

    - by nutcracker
    My website is 100% https. I have links to other http domains, and the referrer header is not set when linking from an https page to an http page. From http://en.wikipedia.org/wiki/HTTP_referrer: "If a website is accessed from a HTTP Secure (HTTPS) connection and a link points to anywhere except another secure location, then the referer field is not sent." I would prefer that other domains can see the referrer, so they know the traffic comes from my domain. Is there a way to force this header, or is there another solution? Update: I've done some basic testing using a redirect:
    - http page -- link to http --> 301 redirect --> http page = referrer intact
    - https page -- link to https --> 301 redirect --> http page = referrer blank
    - https page -- link to http --> 301 redirect --> http page = referrer blank
    - https page -- link to http --> 302 redirect --> http page = referrer blank
    The referrer is lost when linking from an https page to an http redirect page on my own domain, so there is no referrer after the redirect.
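
    One possible direction, offered as an assumption rather than something the post tested: browsers that implement the meta referrer policy let a page opt back in to sending a referrer from HTTPS to HTTP destinations. Support varies by browser, and the 'unsafe-url' value exposes the full URL, hence its name:

        <!-- Sketch: ask supporting browsers to send the Referer header
             for outbound requests, even from HTTPS to HTTP. -->
        <meta name="referrer" content="unsafe-url">

    Browsers without this feature will still suppress the header, so the receiving sites cannot rely on it being present.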

    Read the article

  • Basic Google Analytics Click Tracking and/or Overview

    - by Alan Storm
    This is a really basic Google Analytics question; apologies in advance if it's not appropriate here, but I've had a lot of luck on Stack Overflow and this seems like the best Stack Exchange site for a question like this. I'm trying to understand how Google Analytics goals work, and whether they're the right feature for my situation. Most of the documentation I find online refers to the old version of the UI, not the new one. I have a website, let's call it blog.example.com. This website drives traffic to an ecommerce store, let's call that store.example2.com. I want reports on which links from blog.example.com are being clicked through to store.example2.com. How do you do this in Google Analytics? Are goals the right area to be looking at? Do I set up the goals on store.example2.com or blog.example.com? Or both? Is there any canonical user guide (free or paid) that covers how this works? I'm a competent programmer, but it's years since I dealt with conversion tracking on any serious level, and we've progressed well beyond my frozen-caveman pixel-tracking knowledge. Thanks in advance
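
    For the "which links get clicked" half, one common pattern from that generation of Google Analytics, sketched with the classic ga.js async queue (this assumes the standard _gaq tracking snippet is already installed on blog.example.com; the category and action names are made up for illustration):

        <script type="text/javascript">
        // Sketch: record an outbound click as a GA event before the
        // browser leaves for the store; names are illustrative only.
        function trackOutbound(link) {
          _gaq.push(['_trackEvent', 'Outbound', 'Store', link.href]);
          return true; // let the navigation proceed
        }
        </script>
        <a href="http://store.example2.com/" onclick="return trackOutbound(this)">Store</a>

    Because navigation can cancel the request before it is sent, many implementations add a short setTimeout before following the link; the clicks then appear under Content > Events rather than as goals.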

    Read the article

  • How can I tell GoogleBot that a subdirectory is now a subdomain?

    - by cwd
    I had about a million pages of a catalog indexed under a subdirectory, and that content has now moved to a subdomain. GoogleBot is crawling each page and getting a 301 redirect to the new location. Even though I have set up the redirect rule in the Apache sites-enabled configuration file (i.e. the redirect happens early, before PHP is even loaded), the server isn't handling the load well. GoogleBot is making around 5 requests per second, and on top of my normal traffic that hikes up the CPU for a few hours at a time. I checked in Webmaster Tools and the corresponding documentation for a way to let Google know that the content had moved from a subdirectory to a subdomain, but with little luck. Basically, the most helpful thing I saw said to just send 301 headers for the new location. How can I tell GoogleBot that a subdirectory is now a subdomain? If that is not an option, how can I send 301 redirects out more efficiently for a particular subdomain? I was thinking perhaps the Nginx server, but I'm not sure that I can run both Apache and Nginx side by side on port 80 for different subdomains.
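
    For reference, a minimal sketch of the cheapest Apache form of that rule, using mod_alias in the old domain's vhost (the /catalog/ path and catalog.example.com hostname are placeholders, not the poster's real names):

        # One regex answered by Apache itself, before any application code runs.
        RedirectMatch 301 ^/catalog/(.*)$ http://catalog.example.com/$1

    If the server still can't keep up, Webmaster Tools also offers a crawl-rate setting that can slow GoogleBot down while the backlog of redirects is worked through.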

    Read the article

  • How do I trust an off-site application?

    - by Pieter
    I need to implement something similar to a license server. This will have to be installed off site at the customers' locations and needs to communicate with other applications at the customers' site (the applications that use the licenses) and with an application running in our hosting center (for reporting and getting license information). My question is how to set this up in a way where I can trust that: 1. the license server is really our application and not something that just simulates it; and 2. there is no "man in the middle" (i.e. a proxy or something that alters the traffic). The first thing I thought of was to use client certificates, and that would solve at least #2. However, what I'm worried about is that someone just decompiles the license server (it is built in .NET), alters some logic, and recompiles it. That would be hard to detect from either connecting application. This doesn't have to be absolutely secure, since we have a limited number of customers with whom we have a trust relationship. However, I do want to make it more difficult than a simple decompile/recompile of the license server. I primarily want to protect against an employee or a nephew of the boss trying to be smart.

    Read the article

  • Middle tier language for interfacing C/C++ with db and web app

    - by ggkmath
    I have a web application requiring a middle-tier language to communicate between an Oracle database and math routines on a Linux server, and a Flex-based application on the client. I'm not a software expert and need recommendations for which language to use for the middle tier. The math routines are currently in Matlab but will be ported to C (or C++) as shared libraries; thus, by default, some C or C++ communication is necessary. These routines rely on FFTW (www.fftw.org), which is called directly from C or C++ (so I don't see rewriting these routines in another language). The middle tier must manage traffic between the client, the math routines, and the Oracle database. The client will trigger the math routines asynchronously, and the results will be saved in the db and transferred back to the client, etc. The middle tier will also need to authenticate user accounts/passwords and send out various administrative emails. Originally I thought PHP the obvious choice, but interfacing multiple clients asynchronously with the C or C++ routines doesn't seem straightforward. Then I thought, why not just keep the whole middle tier in C or C++? But I'm not sure if this is done in the industry (C or C++ doesn't seem as web-friendly as other languages). There's always Java + JNI, but maybe that introduces other complications (not sure). Any feedback appreciated.

    Read the article

  • Automated backups for Windows Azure SQL Database

    - by Greg Low
    One of the questions that I've often been asked is how you can back up databases in Windows Azure SQL Database. What we have had access to is the ability to export a database to a BACPAC. A BACPAC is basically just a zip file that contains a bunch of metadata along with a set of bcp files for each of the tables in the database. Each table in the database is exported one after the other, so this does not produce a transactionally consistent backup at a specific point in time. To get a transactionally consistent copy, you need a database that isn't in use. The easiest way to get a database that isn't in use is to use CREATE DATABASE AS COPY OF. This creates a new database as a transactionally consistent copy of the database that you are copying. You can then use the export options to get a consistent BACPAC created. Previously, I had to automate this process myself. Given there was also no SQL Agent in Azure, I used a job in my on-premises SQL Server to do this, using a linked server configuration. Now there's a much simpler way: Windows Azure SQL Database now supports an automated export function. On the Configuration tab for the database, you need to enable the Automated Export function. You can configure how often the operation is performed for you, and which storage account will be used for the backups. It's important to consider the cost impacts of this as well. You are charged for however many databases are on your server on a given day, so if you enable a daily backup, you will double your database costs. Do not schedule the backups just before midnight UTC, as that could cause you to have three databases each day instead of one. This is a much-needed addition to the capabilities. Scott Guthrie also posted about some other notable changes today, including a preview of a new premium offering for SQL Database. In addition to the Web and Business editions, there will now be a Premium edition that has reserved (rather than shared) resources. You can read about it all in Scott's post here: http://weblogs.asp.net/scottgu/archive/2013/07/23/windows-azure-july-updates-sql-database-traffic-manager-autoscale-virtual-machines.aspx
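
    For reference, a minimal sketch of the manual copy step described above (the server and database names are placeholders):

        -- Run while connected to the master database of the target server.
        -- Creates a transactionally consistent copy that can then be
        -- exported to a BACPAC and dropped.
        CREATE DATABASE MyDb_backup AS COPY OF myserver.MyDb;

    The copy runs asynchronously; querying sys.dm_database_copies on the server shows when it has finished.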

    Read the article

  • Guidelines for promoting a web forum or portal

    - by Hafiz
    I am a web application developer and have developed a lot of sites, portals, and apps for clients. Now I want to have my own such sites with which I can do business, but I don't know the steps to get there. What I know is how to make a site or portal - but what after that? There are lots of people getting huge traffic on sites built with simple open source software; many of them are forums. I also want to start a forum and a web portal. I can develop them myself or use open source. But what after that? Content entry and SEO? Is that all it takes to promote a portal or site? Does SEO still work nowadays, or is it all about marketing and advertising? I have no idea, so please tell me what you guys suggest. Thanks for everyone's opinions in advance.

    Read the article

  • ArchBeat Link-o-Rama for 2012-04-12

    - by Bob Rhubart
    - 2012 Real World Performance Tour Dates | Performance Tuning | Performance Engineering (www.ioug.org) - Coming to your town: a full day of real world database performance with Tom Kyte, Andrew Holdsworth, and Graham Wood. Rochester, NY - March 8; Los Angeles, CA - April 30; Orange County, CA - May 1; Redwood Shores, CA - May 3.
    - Oracle Technology Network Developer Day: MySQL - New York (www.oracle.com) - Wednesday, May 02, 2012, 8:00 AM - 4:30 PM, Grand Hyatt New York, 109 East 42nd Street, Grand Central Terminal, New York, NY 10017.
    - Webcast Series: Data Warehousing Best Practices (event.on24.com) - April 19, 2012: Best Practices for Workload Management of a Data Warehouse on Oracle Exadata; May 10, 2012: Best Practices for Extreme Data Warehouse Performance on Oracle Exadata.
    - Webcast: Untangle Your Business with Oracle Unified SOA and Data Integration (event.on24.com) - Tuesday, April 24, 2012, 10:00 AM PT / 1:00 PM ET. Speakers: Mala Narasimharajan, Senior Product Marketing Manager, Oracle Data Integration; Bruce Tierney, Director of Product Marketing, Oracle SOA Suite.
    - The Increasing Focus on Architecture (ArchBeat) (blogs.oracle.com) - As a "third wave" of computing, cloud computing is changing how IT organizations and individuals within those organizations approach the creation of solutions.
    - Updated SOA Documents now available in ITSO Reference Library (blogs.oracle.com) - Nine updated documents have just been added to the IT Strategies from Oracle library, including SOA Practitioner Guides, SOA Reference Architectures, and SOA White Papers and Data Sheets. Access to all documents within the ITSO library is free to those with a free Oracle.com membership.
    - WebLogic JMS Clustering and Spring | Rene van Wijk (middlewaremagic.com) - Oracle ACE Rene van Wijk sets up a WebLogic cluster that includes a JMS environment, which will be used by Spring.
    - Running Built-In Test Simulator with SOA Suite Healthcare 11g in PS4 and PS5 | Shub Lahiri (blogs.oracle.com) - Shub Lahiri shows how the pre-installed simulator that comes with the SOA Suite for Healthcare Integration pack can be used as an external endpoint to generate inbound and outbound HL7 traffic on specified MLLP ports.
    - In the cloud era, let's start calling IT what it is: 'Innovation Team' | Joe McKendrick (www.zdnet.com) - Cloud, the third great shift in 50 years of computing, presents a golden opportunity for IT to get out in front and lead.
    - Thought for the Day: "Why do we never have time to do it right, but always have time to do it over?" - Anonymous

    Read the article

  • Report from OpenWorld Shanghai

    - by jmorourke
    Oracle OpenWorld Shanghai 2013 was held July 22nd – 25th at the International Expo Center in Shanghai, China. The conference drew over 19,000 attendees from 44 countries. In addition, 580 CxOs attended the Executive Edge program, and 430+ partners attended the Oracle Partner Network Exchange. The conference included a number of sessions on Big Data, Business Analytics, Business Intelligence, and Enterprise Performance Management delivered by Oracle, our partners, and customers. I had the pleasure of attending the conference and delivered three sessions focused on Oracle's Hyperion Enterprise Performance Management (EPM) applications. Each of my sessions was well attended, and in a few cases standing room only, so there is clearly a lot of interest in EPM in the China market. The EPM and BI demo pods in the DemoGrounds at the conference also received a lot of traffic. In addition to the conference sessions I delivered, I had several meetings with customers and partners in Shanghai. The sessions and meetings I attended made clear the interest that customers in China have in improving their planning, management reporting, financial reporting, and profitability management processes. In fact, with the China Ministry of Finance now standardizing on XBRL for annual reporting across multiple agencies in China, there is a great opportunity here for our disclosure management application. One interesting finding is that the China market may not be ready for cloud-based applications: many companies are state-owned and have security concerns, so on-premise applications are likely to see continued demand. For more information about the Oracle OpenWorld China 2013 conference, please check the web site: http://www.oracle.com/events/apac/cn/en/openworld/index.html And don't forget, Oracle OpenWorld San Francisco 2013 is just around the corner in September 2013. Please check the web site for registration and content information: http://www.oracle.com/openworld/index.html

    Read the article

  • Pros and cons of separate hosting accounts versus using an addon domain

    - by hen3ry
    Folks: For historical reasons, I have "Site A" on "Hosting Account A", and "Site B" on "Account B", totally independent accounts with the same vendor, Bluehost. Both are primary domains. Now that Hosting Account B is just about to expire, I'm considering letting it disappear and moving Site B to an Addon domain on "Account A". Both sites are non-commercial, narrow-interest, very-low-traffic, hundreds of page views per month. The file weights for the sites are non-trivial, especially as I like to install specialized CMSs in subdomains. Since Bluehost allows unlimited hosting space there should be no issue with the file load, except I've seen hints of an issue with total file count, maybe 50k files -- which I'm not currently close to hitting, but might eventually. My question: what are the pros and cons of using separate accounts versus hosting Site B as an addon domain? Obviously, using a single account is cheaper by half, and I know that my authoring environment (DreamWeaver CS5) complains when it detects nested source trees, telling me "Synchronization" might fail in such cases, but I don't depend on this feature. What other factors should I consider? TIA

    Read the article

  • Can I use a 302 redirect to serve up static content from a URL with _escaped_fragment_?

    - by Starfs
    We would like to serve up SEO-friendly AJAX-driven content. We are following this documentation. Has anyone ever tried to write a 302 redirect into the htaccess file that takes the '?_escaped_fragment_=' string and sends it to a static page, for example /snapshot/yourfilename/? How will Google react to this? I've gone through the documentation and it's not very clear. The quote below is from Google's documentation. I'm not sure if they are saying that you can redirect the _escaped_fragment_ URL to a different static page, or if this is about redirecting the hashtag URL to static content. Thoughts? From Google's site: "Question: Can I use redirects to point the crawler at my static content? Redirects are okay to use, as long as they eventually get you to a page that's equivalent to what the user would see on the #! version of the page. This may be more convenient for some webmasters than serving up the content directly. If you choose this approach, please keep the following in mind: Compared to serving the content directly, using redirects will result in extra traffic because the crawler has to follow redirects to get the content. This will result in a somewhat higher number of fetches/second in crawl activity. Note that if you use a permanent (301) redirect, the url shown in our search results will typically be the target of the redirect, whereas if a temporary (302) redirect is used, we'll typically show the #! url in search results. Depending on how your site is set up, showing #! may produce a better user experience, because the user will be taken straight into the AJAX experience from the Google search results page. Clicking on a static page will take them to the static content, and they may experience avoidable extra page load time if the site later wants to switch them to the AJAX experience."
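
    For what it's worth, a minimal htaccess sketch of the redirect being asked about (assumes Apache with mod_rewrite; untested against Google's crawler, and the /snapshot/ path is the poster's own placeholder):

        # If the crawler asks for ?_escaped_fragment_=<name>,
        # send a 302 to the pre-rendered static snapshot.
        RewriteEngine On
        RewriteCond %{QUERY_STRING} ^_escaped_fragment_=(.+)$
        RewriteRule ^$ /snapshot/%1/? [R=302,L]

    The '^$' pattern limits this to the site root (a broader pattern would cover deeper pages), and the trailing '?' strips the query string from the redirect target.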

    Read the article

  • How to SEO Optimize a JavaScript Image Loader?

    - by skibulk
    I am building an image-centric catalog website. It catalogs collectible gaming cards across 100,000+ pages. Competitor sites receive millions of hits each month, so with the possibility of heavy traffic I need to moderate image bandwidth while also optimizing for image SEO, and I'm looking for tips on doing so. Each page on the site features one card with appropriate tags and descriptions. There are, however, four images for each card: one on matte cardstock, one on foil cardstock, one digital, and one digital foil. In a world with unlimited bandwidth and no-wait page loads, I'd simply embed all four images on the main product page with titles, alt tags, and captions to rank them according to their version keyword. In reality, a JavaScript gallery image loader seems appropriate. Here is a simplified example of my current code. Would this affect SEO in any way? Should I be doing anything differently? Note that I don't want to create a page for each image, as I'd have to duplicate the card tags and descriptions on each one, diluting PR for the main page. Thanks for any insight!

        <script type="text/javascript">
        document.write(
          '<img src="thumbnail1.jpg" data-src="version1.jpg">' +
          '<img src="thumbnail2.jpg" data-src="version2.jpg">' +
          '<img src="thumbnail3.jpg" data-src="version3.jpg">' +
          '<img src="thumbnail4.jpg" data-src="version4.jpg">'
        );
        </script>
        <noscript>
          <img src="version1.jpg">
          <img src="version2.jpg">
          <img src="version3.jpg">
          <img src="version4.jpg">
        </noscript>
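
    As an aside on how the data-src half of such a loader is typically wired up, a minimal sketch (not the poster's actual gallery code; assumes browsers with querySelectorAll and addEventListener, so not IE8):

        // Sketch: fetch the full-size image only on demand by
        // promoting each thumbnail's data-src to its src.
        var thumbs = document.querySelectorAll('img[data-src]');
        for (var i = 0; i < thumbs.length; i++) {
          thumbs[i].addEventListener('click', function () {
            this.src = this.getAttribute('data-src');
          });
        }

    The noscript block then remains the plain-HTML copy that non-JavaScript crawlers can index.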

    Read the article

  • Site failing randomly - could it be Cloudflare or something weird in the JS?

    - by James
    I've been working on a simple site that uses JavaScript to fade through some fullscreen background images, as well as some other simple animations. I've tested the site on Chrome, Safari, FF, and Opera on OSX, IE8+ on Win7, and Chrome and FF on Ubuntu, and everything looks as I'd expect. However, I've had reports of the site failing to load (stopping at the stage where the background fades up) in Safari and Chrome on OSX and Windows. I can't replicate this on any setup, so I'm finding it impossible to troubleshoot. Google's instant preview shows the site fine, as do most of the options at browsershots.org, so I'm really scratching my head. I'm running the site's traffic through Cloudflare and I'm wondering whether anyone can see (or knows from other sites) why Cloudflare might be mangling the JS or causing a problem somehow (I don't get any errors in the JS error console). Of course, if you can replicate the problem on your machine and can suggest an area to look at, that would be amazing, but I'm hoping that, like me, you don't see any problem with the site! Here's the site: http://www.bighornrevelstoke.com Thanks, James

    Read the article

  • Analytics - Where do my drop-offs go?

    - by BadCash
    I have a website set up with Google Analytics (through the WordPress plugin "Google Analytics for WordPress" by Joost de Valk). When I check the visitors flow in Google Analytics, it shows something like this:
    (home) - 43% drop-offs
    /page-2/ - 10% drop-offs
    ... etc ...
    I have also set up events for external links. The main goal of the website is to drive traffic to my Android app on Google Play, so I have a couple of different links to it, all set up as events. Everything seems to be working; the events show up when I go to Content - Events in Google Analytics. However, it seems to me that some percentage of the users reported as "drop-offs" have in fact clicked one of the external links, yet there's no information about the reasons for those drop-offs in the visitors flow chart. I can of course check each specific event category and event action and set "other" to Content/Page, which (I guess) shows the number of visitors who triggered a specific event on a specific page, but that seems like such a complicated way of going about it! So, is there a way to get a more detailed picture, including events, in the visitors flow chart? Something like:
    (home) - 43% drop-offs
    Event Action: "Google Play" = 50%, "Youtube" = 10%, (not set) = 40%

    Read the article

  • Additional new material WebLogic Community 2013

    - by JuergenKress
    - Load Balancing T3 Initial Context Retrieval for WebLogic using Oracle Traffic Director
    - Demystifying WebLogic and Fusion Middleware Management
    - WebLogic Server - Integrated & Optimized with Best of Breed Oracle Offerings to Turbo Charge your Applications
    - Get a Bird's-Eye View of IT Architecture: IT Strategies from Oracle - IT Strategies from Oracle, a complimentary authorized library of guidelines and reference architectures, can help you put together a strong IT architecture that takes into account individual technology components as well as big-picture IT concepts and strategies. Read more.
    - Deploying Oracle Application Development Framework Applications on Oracle Java Cloud Service and Oracle Database Cloud Service - With the new Oracle Cloud environment you no longer have to maintain an Oracle WebLogic server or a database server of your own; you can instead use instances hosted on Oracle Cloud.
    - More Oracle Application Development Framework Development with Eclipse - Oracle Enterprise Pack for Eclipse now provides even more Oracle Application Development Framework tooling with each release. Check out this new tutorial on Oracle Enterprise Pack for Eclipse 12.1.1.2.
    - Oracle WebLogic Devcast Series - Join us for the March 28 Oracle WebLogic Devcast Webcast, "What to Expect from Maven on Oracle WebLogic," featuring Pyounguk Cho, Oracle's principal product manager. Learn what developers can expect when utilizing Apache Maven with Oracle WebLogic.
    - Customer Webcasts: WebLogic Devcast Series - Register: Leveraging Third-Party Libraries to Create and Deploy Applications to Oracle Cloud; Oracle ADF: Tuning Application Module Pools and Connection Pools
    - WebLogic Partner Community - For regular information, become a member of the WebLogic Partner Community: http://www.oracle.com/partners/goto/wls-emea (OPN account required). If you need support with your account, please contact the Oracle Partner Business Center. Blog | Twitter | LinkedIn | Mix | Forum | Wiki
    Technorati Tags: WebLogic, WebLogic Community, Oracle, OPN, Jürgen Kress

    Read the article

  • Hosting and domain registrations for multiple clients under a single hosting account of mine?

    - by letseatfood
    I am finally getting regular work designing, developing, and deploying websites for small businesses and individuals. So far the websites use single-user content management systems, so they create, as far as I know, minimal load on the shared servers. I have always required each of my clients to purchase annual shared hosting at Dreamhost. For domain registration, I ask that they register with Dreamhost, but some already have a domain registered elsewhere, and that is fine with me. I do this so the billing issues are the client's responsibility, not mine. My question is: since I can register unlimited domains and connect them to my one shared hosting account at Dreamhost, should I stop requiring clients to individually pay for shared hosting and a domain? Should I instead pay for one hosting account and host all of my clients' websites on it? As I said before, I currently have each client buy their own hosting because I feel that, for example, if there is high traffic to their site, there is less chance of the site going down than if it were hosted with many others on one account. I am famous for being long-winded, so please let me know if I can clarify at all. Thanks!

    Read the article
