Search Results

  • E-Commerce Website

    - by haargott
    I am planning to create an e-commerce website where users can buy products and services. I want users to register and also take part in something like a browser game, in which each user receives questions to answer. For every question answered successfully they receive points, and the number of collected points determines their rank. Edit 2: Currently I am considering using only HTML, CSS, JavaScript, PHP and SQL to build this e-commerce website. Alongside this I was thinking about learning jQuery, as it may help me, but I am not sure whether I should code everything myself or use the library to work faster. 1) Could you tell me whether those languages are sufficient for creating the website described? 2) Could you tell me what free software tools and frameworks are most appropriate for building this e-commerce website?

  • Google Analytics: (not provided) for 55% of total traffic

    - by Neolisk
    I've been here and here to learn what (not provided) means. Now the question is whether what I am seeing in my Google Analytics stats for my website is considered normal (and whether I can or should do anything about it). Here are the statistics from one day, but other days are similar: of 102 visits, 57 are from (not provided); that's over 55% unknown keywords. Is it normal to have it like that? Does Google plan to do anything about it? In other words, what's the outlook? As I understand it, with this approach, as people switch to HTTPS, Analytics will stop being useful. Please correct me if I am wrong in my assumptions.

  • Multilingual sites and Google search results, do subfolders really work?

    - by AWinter
    About three months ago we added an English version of our previously Japanese-only site http://www.clubberia.com under the subfolder http://www.clubberia.com/en/. We've tried to follow the sometimes incomplete best practices laid out by Google by adding alternate tags to all pages that are currently translated. The top page, for instance, has the following meta tags for language: <link rel="canonical" href="/"> <link rel="alternate" hreflang="ja" href="/"> <link rel="alternate" hreflang="en" href="/en/"> while the English main page under /en/ has: <link rel="canonical" href="/en/"> <link rel="alternate" hreflang="ja" href="/"> <link rel="alternate" hreflang="en" href="/en/"> We also have these alternate languages set up in the sitemap, as per Google's recommendations: http://www.clubberia.com/sitemap.xml. It seems, however, that Google absolutely refuses to show the English top page in results when the user is searching in English at google.com: if you search for "clubberia" you'll, as of this post, get the Japanese description and a title that Google has apparently invented, instead of the title and description in the meta tags of the /en/ index page. Does anyone have experience with subfolders actually working to affect search results? Am I being too impatient, or possibly doing something incorrect? Should we just give up on subfolders and push to subdomains (not the prettiest option)?
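
    One detail worth checking, offered as an assumption rather than a diagnosis: Google's hreflang documentation asks for fully-qualified URLs, while the tags quoted above use relative paths. A minimal sketch of the /en/ page's head with absolute URLs:

      <!-- sketch: canonical and hreflang with absolute URLs -->
      <link rel="canonical" href="http://www.clubberia.com/en/">
      <link rel="alternate" hreflang="ja" href="http://www.clubberia.com/">
      <link rel="alternate" hreflang="en" href="http://www.clubberia.com/en/">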

  • Is it possible to track an outsourced AdWords campaign?

    - by Addsy
    We have a site that is paying a third party to run AdWords campaigns on our behalf. They do this using their own AdWords account, so we are unable to link it to our Google Analytics account. We do have our own AdWords account for the site, which is linked, but the majority of campaigns are run by this third party. At the moment, in our AdWords reports (i.e. the AdWords section within Google Analytics) we are seeing that for the majority of visits the campaign and keyword data is not set. I assume this is because, even though the visit is recognised as coming from an AdWords account, it is not the AdWords account linked to our Analytics account, so we don't get sent the campaign and keyword information. Is this correct, and if so, is there a way we can start tracking this information so that we can see for ourselves how the campaigns are performing? Many thanks
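
    One workaround, sketched under the assumption that the agency is willing to edit its destination URLs: manual campaign tagging works even without account linking, because Analytics reads the UTM parameters straight off the landing-page URL. The values below are placeholders; {keyword} is AdWords ValueTrack:

      http://www.example.com/landing?utm_source=google&utm_medium=cpc&utm_campaign=agency-campaign&utm_term={keyword}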

  • Subdomains on WampServer

    - by MohamedKadri
    I'm working on WampServer for development. I've set up the domain tuniguide.local and it works fine with this configuration: DocumentRoot "D:\www\tuniguide" ServerName tuniguide.local But when I wanted to add a subdomain fr.tuniguide.local I got a 404 Not Found with this configuration: DocumentRoot "D:\www\tuniguide\fr" ServerName fr.tuniguide.local It gives me this message: The requested URL /www/tuniguide/index.php was not found on this server. Is there something that I missed? Thanks.
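
    For reference, a minimal sketch of a setup that usually works on WampServer: each host name gets its own <VirtualHost> block, and Windows needs a hosts-file entry per name. The paths come from the question; the rest is an assumption about a default Apache 2.2 install:

      # httpd-vhosts.conf
      NameVirtualHost *:80

      <VirtualHost *:80>
          ServerName tuniguide.local
          DocumentRoot "D:/www/tuniguide"
      </VirtualHost>

      <VirtualHost *:80>
          ServerName fr.tuniguide.local
          DocumentRoot "D:/www/tuniguide/fr"
      </VirtualHost>

      # C:\Windows\System32\drivers\etc\hosts
      # 127.0.0.1  tuniguide.local
      # 127.0.0.1  fr.tuniguide.local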

  • Visual sitemap generator

    - by rugbert
    I'm looking for something to visually create a sitemap for one of my websites. I'd like something in a tree structure, so I have a hierarchical view of my site. I have a couple of requirements, though: the ability to map password-protected pages, and (not really a requirement) the ability to integrate Google Analytics data. I'm trying an evaluation version of PowerMapper, but the version that includes Analytics integration is around $300, so I'm looking for something cheaper.

  • Can AdWords be cancelled by Google because of improper IE6 site rendering?

    - by user745434
    A client just got a notice from Google saying that their AdWords campaign has been put on hold because the site is: improperly rendering, or under construction, or needs a special program to run. Now, the site renders improperly in IE6; in everything else, including IE7+, it's fine. If this is the issue, would putting up a "Looks like you're using an older browser" message instead of the site for IE6 be a solution? Or must the site look good in IE6 for the AdWords campaign to continue?

  • Is it possible to track redirects to external sites from our subdomains?

    - by ChaBuku
    I have a handful of subdomains set up as redirects because we are using them for QR codes. I want to be able to track the QR code redirects (which are already set up and printed, so no changing them at this point) and see the effectiveness of each. Here are two examples: http://qr.glorkianwarrior.com and http://ad.glorkianwarrior.com are set up to forward to our iTunes page (later this year they may forward to Google Play or a specific landing page). Is there any way on my server to track the redirect from the subdomain to iTunes and see where traffic is coming from first? I currently have the redirects set up through cPanel using subdomains. Edit: From the research I've seen, I can't track a 301 directly. If I redirect to an internal page and then do a timed redirect to the iTunes link, how long does the tracking script need to register the hit?
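
    On the edit: a common pattern is an interstitial page that records a virtual pageview and then forwards. A rough sketch using the classic ga.js async snippet; the property ID and target URL are placeholders, and a few hundred milliseconds is normally enough for the tracking request to fire:

      <script>
        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-XXXXXXX-1']);   // placeholder property ID
        _gaq.push(['_trackPageview', '/qr/itunes']);  // one virtual page per QR code
        (function() {
          var ga = document.createElement('script');
          ga.async = true;
          ga.src = 'https://ssl.google-analytics.com/ga.js';
          document.getElementsByTagName('head')[0].appendChild(ga);
        })();
        // forward once the pageview has had a moment to go out
        setTimeout(function () {
          window.location = 'https://itunes.apple.com/app/...';  // final target
        }, 300);
      </script>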

  • Good links somehow being converted to ones with a PHP redirect (not a virus)

    - by Rebecca
    This has happened to links we put on web pages and in emails. We might put www.oursite.org/work/, but when I view source it shows up as webmail.ourhosting.ca/hwebmail/services/go.php?url=https%3A%2F%2Fwww.oursite.org%2F%2work%2F which ends up at the webmail login page for our web host. But only some of the people who click the link get the login page; others go directly to the page we intended. We don't want it to go to the webmail login page; nobody needs to log in to our website. This occurs for links to pages on our site, but also for links to other sites that we put in emails or in posts. It seems to be browser-independent as well as email-client-independent, as we have variously used Firefox and Chrome as well as MS Outlook and Thunderbird. I've tried to resolve the issue with our web host, but they keep telling me they don't support our browser or our email client (i.e., they don't understand the issue). At the moment, our only option seems to be trying another web host just to get rid of their login page. Any ideas about what's going on?

  • Changing URLs to user-friendly URLs

    - by German
    I'm refactoring my ASP.NET application from ASP.NET 3.5 to 4.0. I'm also changing URLs to user-friendly URLs, for example /product.aspx?id=100 to /product-name/100. All my pages are indexed by search engines, and the site has already been online for 6 years. I'm planning to do a 301 redirect from the old pages to the new ones. I want to make sure I won't lose rank and traffic. Any suggestions on how to do this properly?
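
    A sketch of one way to wire this up, convenient here because ASP.NET 4.0 adds both Web Forms routing and Response.RedirectPermanent; LookupSlug is a hypothetical helper that returns the product's URL name:

      // Global.asax.cs: register the friendly route
      using System.Web.Routing;

      void Application_Start(object sender, EventArgs e)
      {
          RouteTable.Routes.MapPageRoute("product", "{name}/{id}", "~/Product.aspx");
      }

      // Product.aspx.cs: 301 legacy query-string URLs to the new form
      protected void Page_Load(object sender, EventArgs e)
      {
          string id = Request.QueryString["id"];
          if (id != null)
          {
              Response.RedirectPermanent("/" + LookupSlug(id) + "/" + id);
          }
      }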

  • Google Analytics: Do unique events report as unique visits when triggered on pages other than your own domain?

    - by Jesse Gardner
    We just recently attached a SWF to our Brightcove video player to report various events back to Google Analytics. We're also tracking page views with a standard GA snippet on the page where the player is embedded. As I understand it, because a unique visit has already been recorded for the page, any event triggered by the player is associated with that visit. However, we allow people to embed the video player on other websites. All of the event data started pouring into the Events section as expected, but we noticed a dramatic uptick in unique visitors on the site (nearly double) while the pageview count stayed relatively unchanged. Disabling event tracking brought the traffic back down to average levels. I should also add that in the Pages section of Event tracking we're seeing URLs of other sites where the player has been embedded, but this data isn't showing up in the Content section. It seems counterintuitive, but does GA count a fired event as a unique visit even if it's triggered somewhere other than your website? If so, is there any way to record an event in the Events section without it adding to the unique visitor count?
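
    If the aim is to keep embed-driven events out of the main profile's visit counts, one option (a sketch, not a confirmed fix) is to send player events through a named second tracker pointed at a separate property; classic ga.js supports tracker-name prefixes, and the UA code below is a placeholder:

      // separate property just for the embedded player
      _gaq.push(['player._setAccount', 'UA-XXXXXXX-2']);
      _gaq.push(['player._trackEvent', 'Video', 'Play', 'Clip title']);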

  • How to calculate the maximum number of requests a 128 MB VPS can handle?

    - by ifdion
    I am a newbie here, so please let me know if I'm using the wrong webmaster terms. I am currently setting up a VPS for a multisite WordPress installation. The VPS uses a Debian 6 LNMP setup, and DNS is handled by another service. Currently the VPS is running a non-multisite WordPress using roughly 83 MB of its 128 MB RAM. As far as I know, performance is relative to the number of requests, not the number of sites in the multisite setup. The question: how do I calculate the maximum number of requests with that setup? If this information is not enough, what other factors do I need to know? Thank you in advance.
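
    There is no exact formula, but a common back-of-envelope calculation runs like this; every number below is an illustrative assumption, not a measurement:

      free RAM          = 128 MB - ~83 MB in use           = ~45 MB
      PHP-FPM workers   = 45 MB / ~25 MB per WP process    = 1-2 workers
      PHP requests/sec  = workers / average response time  = 2 / 0.5 s = ~4 req/s

    Static files served directly by nginx cost far less, so the PHP worker count is usually the binding constraint; capping pm.max_children in the PHP-FPM pool at that figure keeps the box out of swap.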

  • Self-censorship of our search results

    - by user5261
    We run a small search engine and have recently been notified of a number of hate-related links in our results that would upset a significant proportion of our users. Our first instinct is to summarily remove these results, but I'm concerned that this makes us little better than the oppressive regimes that censor the web. Where does one draw the line, and how might one justify removing results that we deem offensive?

  • Can't get Rewrite rule to keep original URL

    - by user38100
    I have these rewrites, but I would like the URL to stay the same as what was typed originally. I thought removing the [R] flags would stop the redirect, but it hasn't: RewriteCond %{HTTP_HOST} ^examplea\.example\.com$ [NC] RewriteRule (.*) http://examplea.example.com:32400/web [L] RewriteCond %{HTTP_HOST} ^exampleb\.example\.com$ [NC] RewriteRule (.*) http://exampleb.example.com:9091 [L] Edit: would this work better? RewriteCond %{HTTP_HOST} ^hello.example.com$ RewriteRule ^(/)?$ welcome [L]
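
    One detail that may explain it: a RewriteRule target that includes a scheme and host is treated as an external redirect even without [R]. To keep the typed URL in the address bar, the request has to be proxied instead, roughly like this (assumes mod_proxy is enabled; the backend address and path are placeholders):

      RewriteCond %{HTTP_HOST} ^examplea\.example\.com$ [NC]
      RewriteRule ^/?(.*)$ http://localhost:32400/$1 [P,L]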

  • How to Program AWS Spot Instances to Strategically Bid So the Auction is Never Lost Until a Competitor Beats the Maximum I'm Willing to Pay?

    - by Taal
    I believe I'm in the right section of Stack Exchange to ask this; if not, let me know. I only use Amazon Web Services for temporary hosting, so spot instances are quite valuable to me. I could also just create an instance and start and stop it, but that doesn't fit my bootstrapped budget, sadly. Anyway, it really kills me when someone outbids me on a spot instance I have (I tend to go for the larger ones, of which fewer are available) and I get kicked off at random. I know, or at least believe, there is a way to program something to dynamically change your bid price to beat a potential competitor's if theirs is higher than yours. Now, I previously believed Amazon would just charge me the price right above the next-lowest competitor's bid automatically (eliminating the need for this), so that if I bid too high, I would only pay what was needed to win and keep the auction. Essentially, I thought my bid price was my maximum bid price. Apparently, according to my bills and several experiments I've done, this is not the case: they charge me whatever I bid, even when I know there is no one else around to counter-bid me. I needed to clarify that, but let me get back to the main point. Let's say I'm bidding $0.50 and a competitor comes in and bids $0.55; I get kicked off. I want to set a maximum I'm willing to pay (say $1.00 here), and then when a competitor comes in and bids $0.55, have my bid dynamically adjusted to beat theirs at $0.56, up until they break my $1.00 threshold. I've been reading the guides, and although they are more or less straightforward, I feel like they leave a few holes that end up confusing me: for instance, where do I input said command, or when do I run it? Maybe I'm just tech-illiterate and need help deciphering these guides. A good start for someone willing to answer/help me decipher this problem would be here: http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/spot-as-update-bid.html
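
    As far as I know there is no API to raise an existing bid in place; the usual workaround is a watcher script that cancels and re-submits below a hard ceiling. A rough AWS CLI sketch, where the instance type, prices, and the launch specification file are all placeholders:

      MAX=1.00
      PRICE=$(aws ec2 describe-spot-price-history \
          --instance-types m3.xlarge --product-descriptions "Linux/UNIX" \
          --max-items 1 --query 'SpotPriceHistory[0].SpotPrice' --output text)
      BID=$(echo "$PRICE + 0.01" | bc)
      if [ "$(echo "$BID <= $MAX" | bc)" -eq 1 ]; then
          # re-submit the outbid request at the new price
          aws ec2 request-spot-instances --spot-price "$BID" \
              --launch-specification file://spec.json
      fi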

  • What is the average page size for a single-page application (SPA)? [on hold]

    - by Emmanuel Istace
    I'm developing a single-page application with a lot of CSS and JavaScript. For now the page is 1.3 MB, composed of 5 sections. Here are the rounded stats:

      Document:   10 kB
      Style:      60 kB
      Images:     450 kB (already compressed; includes a big gallery of thumbnails)
      JavaScript: 700 kB (600 kB of "framework": jQuery, jQuery UI, Bootstrap, Modernizr, Waypoints, ...; 100 kB of custom JS)
      Fonts:      125 kB

    And the site is not finished yet (it will include the Google Maps API, among others). My questions are: Do you have any statistics about the average weight of an SPA? As this is the whole website, do you think it's acceptable? Is lazy loading (for images) a solution? What will the impact on SEO be? Is Google's "200 kB rule" still relevant? Do you know good tools for detecting which JavaScript code is not used during the execution of a page, to help trim those 700 kB of framework JS? Can a caching strategy be an answer?

  • Domain in PENDINGDELETE, question about drop

    - by kcdwayne
    A domain I want is in the pendingDelete stage according to WHOIS. I have been monitoring it since the redemptionPeriod, and it entered pendingDelete 5 days ago today. After checking a few services (SnapNames, etc.), I found they report it is scheduled to drop on the 11th (7 days away, by my calculations). I'm not quite sure what to believe. The domain isn't highly valuable; it is to me and to one other company. I can see no backorders placed on the big-name sites, so I'm thinking of trying to get it without a backorder service. Any insight as to when it will actually drop? I've read 11AM-2PM PST, but I'm unsure. Thanks.

  • Embedding a generic Google search with autocomplete, not a custom site search

    - by picxelplay
    Most people's home page is google.com. My home page is a custom HTML page hosted on my computer. I do this because I am a web developer and have several projects that I work on at one time, so I like having quick links to all of them. On that page I usually just have a link to google.com for when I want to search. But below all of my quick links, I want to add a Google search box (with autocompletion). I first used a simple iframe to embed google.com into the page, but then my search results were confined to that iframe; I want to search for something and have the results open in a new tab. I then came across this code snippet, but it doesn't have autocompletion: http://www.refactory.org/s/google_search/view/2 How can I add autocompletion to this? Or is there a better way of doing it? Thanks in advance for any advice.
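
    For what it's worth, a minimal sketch: a plain form that posts to Google and opens results in a new tab, with autocomplete wired to Google's suggest endpoint through jQuery UI (assumed to be loaded on the page). The endpoint and its response shape are unofficial assumptions and may change or be blocked:

      <form action="https://www.google.com/search" method="get" target="_blank">
        <input type="text" name="q" id="q" autocomplete="off">
      </form>
      <script>
      $('#q').autocomplete({
        source: function (request, response) {
          $.ajax({
            url: 'https://suggestqueries.google.com/complete/search',
            dataType: 'jsonp',
            data: { client: 'firefox', q: request.term },
            success: function (data) { response(data[1]); }  // ["term", [suggestions]]
          });
        }
      });
      </script>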

  • 302 redirect issue on a Joomla 2.5.7 site

    - by DDD
    My site runs Joomla 2.5.7 and uses a Facebook comments tool for its articles. I am getting a 302 redirect problem with the FB comments on the articles I post. I checked the URLs at http://www.webconfs.com/http-header-check.php and got the following result, with a 302 redirect, for http://www.fijoo.com:

      HTTP/1.1 302 Moved Temporarily
      Date: Wed, 21 Nov 2012 09:46:39 GMT
      Server: Apache/2.2.22 (Unix) mod_ssl/2.2.22 OpenSSL/1.0.0-fips mod_auth_passthrough/2.1 mod_bwlimited/1.4 FrontPage/5.0.2.2635 mod_perl/2.0.6 Perl/v5.10.1
      X-Powered-By: PHP/5.3.16
      Set-Cookie: =en-GB; expires=Wed, 21-Nov-2012 10:46:40 GMT
      Location: /
      Content-Length: 0
      Connection: close
      Content-Type: text/html

    How can I overcome this? Any help appreciated.

  • Need a CDN with SSL

    - by Till
    We currently use EdgeCast through Speedyrails. Back when I did my research they were both fast and very cost-effective. I haven't looked in a while, but now we need SSL on our assets as well. I reached out to our current provider and they want a setup fee and something like 260 USD per host per month (we currently use multiple hosts). I looked at AWS CloudFront and it seems the most cost-effective way to get SSL, but then it's not a custom domain (e.g. cdn.example.org), which I could live with. Has anyone else researched this lately and has any providers to get in touch with? They can be resellers or direct buys. I'm not looking for a bargain; I just want to get an idea what these things cost. Edit, 2012-08-23: Custom origin is a must-have; e.g. I don't want to manually upload files somewhere else. EdgeCast and CloudFront both support this.

  • Why is Firefox changing the color calibration of this image?

    - by eoinoc
    The symptom of my problem is that a hex color in a PNG image does not match the CSS-defined color with the same hex code. This only happens in Firefox when gfx.color_management.mode is set to 2 (tagged images only) rather than 0 (off). (Firefox ICC color correction is described here.) The image is http://dzfk93w6juz0e.cloudfront.net/images/background-top-light.png which at the bottom has the color #c8e8bd. However, the shade of green differs from that color when Firefox color calibration is enabled. Is this image inadvertently "tagged" for color correction?
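
    An easy way to check, assuming ImageMagick is installed: inspect the file for an embedded colour profile, and write an untagged copy if the tagging turns out to be unintended:

      # look for an embedded ICC profile
      identify -verbose background-top-light.png | grep -i profile
      # write a copy with all metadata, including any profile, stripped
      convert background-top-light.png -strip background-top-light-untagged.png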

  • Is using HTML entities (for language-specific characters) in UTF-8 necessary?

    - by Drachenzauberei
    As in the subject line. I saw this the other day on a page, and it felt weird to me. Except for markup-delimiting characters such as angle brackets or the ampersand, escaping, say, German umlauts shouldn't be necessary, should it? I checked the encoding server-side, in-page, and by way of the HTTP headers; it looks completely UTF-8 to me. What's your take on this, and do you reckon it could adversely affect SEO or SERP placement?
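
    For illustration, on a page correctly declared as UTF-8 only the markup-significant characters need escaping; raw umlauts are fine:

      <meta charset="utf-8">
      <p>Müller &amp; Söhne</p>  <!-- & escaped; ü and ö left as raw UTF-8 -->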

  • Why do some user agents have spam URLs in them (and why are they always Opera/Presto user agents)?

    - by Erx_VB.NExT.Coder
    If you go to, say, the last 100 entries (visits) on the botsvsbrowsers.com website (exact link, feel free to take a look: http://www.botsvsbrowsers.com/recent/listings/index.html ), you'll notice that almost every user agent containing the keywords "Opera" and "Presto" has a web link (URL/web address) inside it, and not just a plain web address but an HTML anchor tag linking to that address. Why is this so? I could not find a single discussion about it anywhere on the internet, and I tried varying my search terms many times. A user agent containing the words "Opera" and "Presto" is not guaranteed to carry this web link, but there is about an 80% chance that it will. A typical anchor tag/link inside a user agent looks like this: Mozilla/4.0 <a href="http://osis-uk.co.uk/disabled-equipment">disability equipment</a> (Windows NT 5.1; U; en) Presto/2.10.229 Version/11.60 If you check it out at http://www.botsvsbrowsers.com/recent/listings/index.html you will notice that the angle brackets are there in unescaped form. This isn't just true for botsvsbrowsers but for several other user-agent listing sites. I'm really confused and feel like I'm in a room full of 10,000 people and am the only one seeing this ghost. If I'm doing statistical analysis, should I include or exclude this type of user agent from my listing (i.e. are these just normal users who've set their user agents to try to drive some traffic to their sites as they browse the web), or is there something else going on? The consistency of the format leads me to believe it is an automated process (the setting or alteration of the user agent), but I cannot work out which program or facility is doing it, especially since it is exclusive to Opera (Presto) user agents beyond, I think, version 8 or 9 point something. I've run some statistical tests, parsing entries from all over the place and writing custom programs, to get a better understanding of this. Keep in mind that I do see normal URLs in user agents infrequently; they are just text such as +http://www.someSite.com appended to a user agent, especially when a crawler or bot provides its service URL. That is normal and isn't done with an embedded link (A HREF=) etc., so I'm not talking about those.

  • QR Codes and Short Links - Please Take A Look [closed]

    - by Joe Turner
    I'm looking for a way to create a QR code and a shortened link when a form is submitted. I have the QR code bit, but the link is too long for me, and the QR code looks scary and complicated. The way it works is: the user types in (in this instance) a contract number. Then a folder is created on the server for that contract number (www.mysite.com/QR/$contractnumber). Then, using PHP again, I create a QR code through Google, because I know every QR code will link to the same place, just with a different ending of the link. The only bit that changes is the $POST... I was wondering if there was a way to shorten the link before it goes to Google? It would have to be through PHP. The user enters the contract number in the form, then that number (usually around 5 or 6 digits) is inserted into an already existing command? I'm not an expert in anything; I just know some really random snippets of code, and HTML and CSS, of course. Any help would be appreciated, and judging by the few days I have been searching for this, I think it might help a few people in the future. I should also clarify that the solution can't be one of these visual URL shorteners; if it is, it needs to be just the back end of one, built into the existing form and QR generator. Simple?
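
    A rough sketch of the idea in PHP. Since the per-contract folder already exists, the shortening can happen on the same server: encode a bare /q/<contract> path (a made-up rewrite target, to be handled by .htaccess) and hand that to Google's Chart API QR generator:

      <?php
      // keep the encoded URL short: /q/<contract> redirects to /QR/<contract>/
      $contract = $_POST['contract'];
      $short = 'http://mysite.com/q/' . rawurlencode($contract);
      $qr = 'https://chart.googleapis.com/chart?cht=qr&chs=300x300&chl='
          . urlencode($short);
      echo '<img src="' . htmlspecialchars($qr) . '" alt="QR code">';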

  • Google Site Search (commercial) not indexing files in sitemap

    - by melat0nin
    I have a client for whom we have purchased Google Site Search. It works well for HTML pages served by the CMS, but files aren't being reliably indexed. I wrote a script to generate an XML feed (sitemap) of all the files in the CMS, which I've plugged into Google Webmaster Tools for the site. It says that for that sitemap 923 URLs have been submitted, but only 26 have been indexed. The client relies heavily on searching within files, which is why we decided to use Google search, so this is a bit of a problem. Many of the files aren't linked to from any page on the site, as they are old and don't merit pages of their own, but they still need to be accessible through search for archiving purposes. The file-archive XML can be found at www.sniffer.org.uk/file-archive and the standard XML sitemap (of pages) at www.sniffer.org.uk/sitemap.xml. Any thoughts would be much appreciated!
