Search Results

Search found 14789 results on 592 pages for 'pro backup'.


  • How to develop an Online Shopping Portal Application using PHP?

    - by Sarang
    I do not know PHP and I have to develop a shopping portal with the following definition. Scenario: Online Shopping Portal. XYZ.com wants to create an online shopping portal for managing its registered customers and their shopping. Customers need to register before they can shop using the portal; however, everyone, registered or not, can view the various products along with the prices listed in the portal. Registered customers, after logging in, can place an order for one or more of the listed products. Once an order is placed, the customer gets a reference order number and the order status is set to “order in process”. Customers can track their order using the given reference number. The management of XYZ.com should be able to change the order status for a particular reference number to “shipped” once the products are shipped to the shipping address the customer entered when placing the order. The functionalities required are:
    1. Create the interface for the XYZ.com shopping portal using HTML/XHTML and CSS.
    2. Implement the client-side validations using JavaScript.
    3. Create the tables using MySQL.
    4. Implement the functionality using the server-side scripting language, PHP.
    5. Integrate all the above tasks and make the XYZ.com shopping portal functional.
    What are the proper development steps for building this application?
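
    The order-tracking piece of the brief above boils down to an orders table keyed by a reference number plus a status column. Below is a minimal sketch of just that piece, assuming a MySQL database named shop, an existing customers table, and PDO on the PHP side (all table, column and function names here are illustrative, not from the original brief; line items and payment are omitted):

      <?php
      // place_order.php - sketch of order placement with a trackable reference number.
      // Assumed schema (illustrative):
      //   CREATE TABLE orders (
      //     id          INT AUTO_INCREMENT PRIMARY KEY,
      //     reference   CHAR(10) NOT NULL UNIQUE,
      //     customer_id INT NOT NULL,
      //     status      ENUM('order in process', 'shipped') NOT NULL DEFAULT 'order in process',
      //     created_at  DATETIME NOT NULL
      //   );

      $pdo = new PDO('mysql:host=localhost;dbname=shop;charset=utf8', 'shop_user', 'secret');
      $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

      function placeOrder(PDO $pdo, int $customerId): string
      {
          // Generate a random 10-character reference the customer can use for tracking.
          $reference = strtoupper(bin2hex(random_bytes(5)));

          $stmt = $pdo->prepare(
              'INSERT INTO orders (reference, customer_id, status, created_at)
               VALUES (:ref, :cust, :status, NOW())'
          );
          $stmt->execute([
              ':ref'    => $reference,
              ':cust'   => $customerId,
              ':status' => 'order in process',
          ]);
          return $reference;
      }

      function orderStatus(PDO $pdo, string $reference): ?string
      {
          // Customers track their order with the reference number they were given.
          $stmt = $pdo->prepare('SELECT status FROM orders WHERE reference = :ref');
          $stmt->execute([':ref' => $reference]);
          $status = $stmt->fetchColumn();
          return $status === false ? null : $status;
      }

    Management's "mark as shipped" screen is then a single UPDATE against the same reference column, and the interface, client-side validation and product catalogue can be layered around this independently.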

    Read the article

  • SEO and unique IDs in URLs

    - by kokoko
    I have a website whose home page is at http://domain.com. When a new user first hits the site, I create a unique five-character ID for them (for example 'abcde') and redirect them to http://domain.com/abcde so they can later bookmark the URL and return to their workspace. My question is: what is the best approach for SEO? I need the main URL domain.com to be indexed, but Google will also get the redirect and will not index the main page. I know about canonical URLs, but that only applies when the domain.com URL does not redirect. Also, should I use a 301 or a 302 code in the redirect?
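
    For the pattern described above, one common combination is a temporary (302) redirect from the root plus a canonical tag on each workspace page pointing back at the root. A minimal PHP sketch under those assumptions (the file layout and variable names are illustrative; whether Google honours a canonical that points at a redirecting URL is exactly the part worth testing):

      <?php
      // index.php - front controller sketch: mint a workspace ID and 302 to it.
      // A 302 says the root itself has not permanently moved, so the root URL
      // keeps its place in the index; a 301 would tell crawlers to replace it.

      $path = trim(parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH), '/');

      if ($path === '') {
          // First visit to the bare domain: create a 5-character workspace ID.
          $workspaceId = substr(str_shuffle('abcdefghijklmnopqrstuvwxyz'), 0, 5);
          header('Location: /' . $workspaceId, true, 302);
          exit;
      }

      // /abcde - render the workspace and point search engines back at the root,
      // so the per-user URLs do not compete with the home page.
      ?>
      <!DOCTYPE html>
      <html>
      <head>
        <title>Workspace <?php echo htmlspecialchars($path); ?></title>
        <link rel="canonical" href="http://domain.com/" />
      </head>
      <body>
        <!-- workspace content -->
      </body>
      </html>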

    Read the article

  • Managing user privileges, best practices [on hold]

    - by Loïc N.
    I am new to web development. I'm creating a website where different users can have different privileges, such as creating/editing/deleting news items, or adding/editing/deleting whatever kind of content on the website. I started by creating a "user type" that would indicate the user's privileges (such as "user", "newser", "moderator", "admin", and so on), but I quickly noticed issues that made me think this might be a naive approach. What if I want to give a regular user the right to edit a news item (for whatever reason)? Then the user would be half "user", half "newser", but the system I use can only handle one user type. So what would be the best practice here? I was thinking of removing the concept of roles (or "user types" such as newser) and only having the concept of "privilege", where every user could have zero to many privileges. So, to reuse the above example, if I wanted a user to have the right to edit some news, I would only have to give them an "edit news" privilege. Is this the way to go?
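
    The per-privilege model described in the last few sentences is usually a plain many-to-many join between users and privileges. A minimal sketch, assuming MySQL and PDO (table, column and function names are illustrative):

      <?php
      // Privilege check for a role-less, per-privilege model.
      // Assumed schema (illustrative):
      //   CREATE TABLE privileges      (id INT AUTO_INCREMENT PRIMARY KEY, name VARCHAR(50) UNIQUE);
      //   CREATE TABLE user_privileges (user_id INT, privilege_id INT, PRIMARY KEY (user_id, privilege_id));

      session_start();

      function hasPrivilege(PDO $pdo, int $userId, string $privilege): bool
      {
          $stmt = $pdo->prepare(
              'SELECT COUNT(*)
                 FROM user_privileges up
                 JOIN privileges p ON p.id = up.privilege_id
                WHERE up.user_id = :user AND p.name = :priv'
          );
          $stmt->execute([':user' => $userId, ':priv' => $privilege]);
          return (bool) $stmt->fetchColumn();
      }

      // Usage: gate the news editor on a single privilege, not on a user type.
      $pdo = new PDO('mysql:host=localhost;dbname=site;charset=utf8', 'site_user', 'secret');
      if (!hasPrivilege($pdo, (int) ($_SESSION['user_id'] ?? 0), 'edit news')) {
          http_response_code(403);
          exit('You do not have permission to edit news.');
      }

    Named roles can still exist on top of this as predefined bundles of privileges, which keeps the "half user, half newser" case from ever arising.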

    Read the article

  • Managing multiple Adwords accounts from one Google account

    - by CJM
    I have a Gmail account linked to numerous Analytics accounts and a couple of AdWords accounts - that is, I can track stats for a dozen or so sites and have administration rights for a couple of AdWords campaigns. However, a client has already set up their AdWords account and has invited me to help administer their campaign. When I try to accept, I get the following error: The Google Account xxx already has access to an AdWords account (Customer ID: ). As many have discovered, for some reason Google won't let an account that already owns an AdWords campaign join another account. I'm wondering if there is any workaround for this? Temporarily, I'm using a separate Gmail account, but what is the longer-term solution? Going forward, some clients will be happy for me to 'host' their campaigns (while providing them with access), but I'm equally sure that many will want to retain greater control. Surely there must be a better way than creating an additional Gmail account for each client? How do web/SEO agencies handle this?

    Read the article

  • Apache mod_pagespeed ignores redirected pages

    - by Terra
    Using mod_pagespeed (https://developers.google.com/speed/pagespeed/module) with an Apache Server, I noticed that some pages were not being processed. The pages in question all had one similar attribute - they were "redirected" pages - for example an ErrorDocument response or an "index.html" file serving as the response when a directory is requested. Is there any way to remedy this? I've checked the FAQ and had a good trawl through the PageSpeed Documentation but to no avail.

    Read the article

  • Google Analytics and AdWords showing very different figures

    - by Dave Rook
    I have only one ad running in AdWords. The landing page includes a query string so I can track it, e.g. www.mydomain.com/products?source=CPC. I also use Google Analytics. For February I have approx 1450 clicks in AdWords, which means roughly 1450 visitors should have reached my website. In Google Analytics, according to my landing page report, there were only ~850 visits, and the Acquisition - All Traffic page suggests that Google CPC brought 517 visits. I know tracking tools are not 100% reliable, but this discrepancy suggests something is very wrong. How can I tell which of the figures is accurate, or is this just a limitation of the reporting tools?

    Read the article

  • Do AdWords conversions only track AdWords visitors?

    - by atticae
    I have set up an AdWords conversion and put the conversion tracking JS code on a test page. However, I don't see any conversions tracked when I visit it. Does AdWords conversion tracking only register a conversion if the visitor came to the site by clicking on an AdWords campaign? Google's help page advises me to test the code by clicking a campaign ad. However, I would like to use the tracking to track all conversions, not just AdWords ones. I considered using Analytics as well, but it seems you can only track goals via URL there, not JS, which would mean restructuring part of my page (currently a conversion does not necessarily happen on a different URL). Is AdWords tracking not a viable solution for tracking all visitor conversions on my site?

    Read the article

  • Is Pixelmator a viable alternative to Photoshop? [migrated]

    - by ChrisR
    I've always been a Photoshop user; I know the ins and outs and know my way around all the tools I need for my web design work. But now I'm faced with a dilemma: for my new job I don't have the budget for a full Photoshop license, so I'm wondering, is Pixelmator a good alternative? I use Photoshop mainly to slice a design into separate images, so being able to enable/disable layers is a must, as is PSD compatibility. Does anyone have experience with Pixelmator?

    Read the article

  • SEO-friendly JS includes for HTML

    - by ascar
    I need to do client-side includes for the navigation on a website. The problem is that the navigation is currently all HTML, and since I can't use PHP I've just been using JS to include the HTML with document.write(). However, I read that this is not good for SEO. While I know there are SEO-friendly JS navigation menus, I was wondering if there's a way to do this while including the HTML without having to alter it very much. I also got it to work using an embed tag with the HTML, which I know is super hacky, but I don't really know that much about JS and am frustrated at being told not to use server-side includes. Thanks!

    Read the article

  • Monitoring Domain Availability

    - by JP19
    How can I write a tool to monitor domain name availability? In particular, I am interested in monitoring the availability of a domain that is in PENDINGDELETE (or REDEMPTIONPERIOD, REGISTRY-DELETE-NOTIFY, PENDINGRESTORE or similar) status after its expiration date. Any suggestions or further information about PENDINGDELETE and the similar statuses are also welcome - for example, how long a domain can remain in each status. I usually don't see a fixed pattern, or even a consistent correlation between the expiration date and these statuses.
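
    A monitoring tool along these lines usually just polls WHOIS on a schedule and watches the status lines. A minimal PHP sketch against the .com/.net registry WHOIS server (that host is correct for .com/.net only; other TLDs use other servers, and the exact status vocabulary and timing vary by registry):

      <?php
      // whois_status.php - query the registry WHOIS and report a domain's status flags.

      function whoisStatus(string $domain): array
      {
          $fp = fsockopen('whois.verisign-grs.com', 43, $errno, $errstr, 10);
          if ($fp === false) {
              throw new RuntimeException("WHOIS connection failed: $errstr");
          }
          fwrite($fp, "domain $domain\r\n");
          $response = stream_get_contents($fp);
          fclose($fp);

          // Collect every "Status:" line, e.g. pendingDelete, redemptionPeriod, clientHold.
          preg_match_all('/Status:\s*(\S+)/i', $response, $matches);
          return $matches[1];
      }

      // Available (or fully dropped) .com domains come back with "No match for ..." and no status lines.
      $statuses = whoisStatus('example.com');
      if (empty($statuses)) {
          echo "No WHOIS record found - the domain is likely available or has dropped\n";
      } else {
          echo "Current status flags: " . implode(', ', $statuses) . "\n";
      }

    Running this from cron a couple of times a day is normally enough; on .com, redemptionPeriod typically lasts around 30 days and pendingDelete around 5 days before the drop, but those windows are not something to rely on to the hour.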

    Read the article

  • Scripts Causing Flash Intro Animation To Stop [migrated]

    - by ubique
    When my Flash website loads, it freezes halfway through the initial animation for 2-3 seconds and then continues. This obviously doesn't look great and I can't figure out what is causing it. I think it is one of the scripts in index.html causing the issue and I have tried all sorts of ways to correct it - what have I done wrong?
      <!DOCTYPE html>
      <html lang="en">
      <head>
        <title>company name</title>
        . . .
        <link href="style.css" rel="stylesheet" type="text/css" />
        <script type="text/javascript" src="js/flashobject.js"></script>
        <!--[if lt IE 7]>
          <link href="ie6.css" rel="stylesheet" type="text/css" />
        <![endif]-->
      </head>
      <body>
        <header>
          <hgroup>
            <h1>company</h1>
            <h2>company</h2>
          </hgroup>
        </header>
        <div id="container">
          <div id="head">
            <div class="aligncenter">
              <a href="http://www.adobe.com/go/EN_US-H-GET-FLASH">
                <img src="http://www.adobe.com/images/shared/download_buttons/get_adobe_flash_player.png" alt="" /></a>
            </div>
          </div>
        </div>
        <div class="g-plus" data-href="https://plus.google.com/100925740920754223119?rel=publisher"
             data-width="170" data-height="69" data-theme="light">
      </body>
      <!-- Flash -->
      <script type="text/javascript">
        var fo = new FlashObject("main_v10.swf", "head", "100%", "100%", "8", "");
        fo.addParam("quality", "high");
        fo.addParam("allowFullScreen", "true");
        fo.write("head");
      </script>
      <!-- Hello Bar -->
      <script type="text/javascript" src="//www.hellobar.com/hellobar.js"></script>
      <script type="text/javascript">
        new HelloBar(39040,52484);
      </script>
      <!-- GPlus -->
      <script type="text/javascript">
        window.___gcfg = {lang: 'en'};
        (function() {
          var po = document.createElement("script");
          po.type = "text/javascript";
          po.async = true;
          po.src = "https://apis.google.com/js/plusone.js";
          var s = document.getElementsByTagName("script")[0];
          s.parentNode.insertBefore(po, s);
        })();
      </script>
      <!-- Google -->
      <script type="text/javascript">
        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-xxxxxxxx-1']);
        _gaq.push(['_setSiteSpeedSampleRate', 10]);
        _gaq.push(['_trackPageview']);
        (function init() {
          var ga = document.createElement('script');
          ga.type = 'text/javascript';
          ga.async = true;
          ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
          var s = document.getElementsByTagName('script')[0];
          s.parentNode.insertBefore(ga,s);
        })();
        window.onload = init;
      </script>
      </html>

    Read the article

  • DNS name server error

    - by Danny
    I am getting a DNS error in Google Webmaster Tools, even after testing the domain with http://dnscheck.pingdom.com/?domain=ansoftsys.com&timestamp=1372108107&view=1 and checking the name server details. A screenshot of my DNS management page and the DNS error report generated from that same link are attached. How do I solve this issue?

    Read the article

  • Ars Technica .ars URL suffix -- Vanity or SEO Benefit?

    - by yc01
    The technology website Ars Technica has adjusted its URL rewrite rules so that URLs end with .ars. Traditionally, sites have taken advantage of URL rewriting to completely eliminate file suffixes like .html, .php, .aspx etc., under the theory that this made for better SEO (since the content of the URL was more relevant to the content). Ars Technica's URLs, though, look like this: http://arstechnica.com/science/news/2011/03/flow-from-the-poles-drive-sunspot-levels.ars So, is Ars Technica adding the .ars file suffix purely as a vanity play? Or is it an SEO trick to improve the site's ranking by cleverly inserting the site name into every URL slug? And, if this is indeed an effective SEO trick, should other sites follow suit?

    Read the article

  • mod_rewrite - URL rewriting

    - by modrewriteNewbie
    I am very new to mod_rewrite. I need to redirect any user with a "citizenhawk" parameter in their URL to my own URL. For example, http://www.mywebsite.com/?sc=CX12N003&cm_mmc=affiliate--citizenhawk--nooffer-_-na&prfc=5&clickid=0004c845fa9a87050a4277221a003262 should result in http://www.mywebsite.com/ Here are my rewrite directives:
      RewriteCond %{QUERY_STRING} (&|^)cm_mmc=(.)citizenhawk(.)(&|$)$
      RewriteRule ^/rrs/ [NC,R=302,L]
    Where am I going wrong? Is my RewriteCond wrong?
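
    For what it's worth, the rule as posted has no substitution target, and in a .htaccess context the matched path does not start with a leading slash. A hedged correction along these lines (assuming the goal is simply to send those visitors to the home page with the tracking query string stripped):

      RewriteEngine On
      # Match cm_mmc=...citizenhawk... anywhere in the query string
      RewriteCond %{QUERY_STRING} (^|&)cm_mmc=[^&]*citizenhawk [NC]
      # Redirect to the home page; the trailing "?" drops the original query string
      RewriteRule ^ http://www.mywebsite.com/? [R=302,L]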

    Read the article

  • Is SimplePHPBlog a secure blogging engine?

    - by authentictech
    Has anyone used the blog engine SimplePHPBlog? It is a simple blog engine that uses only text files (no database). My problem with it is that the content directory where the texts files are stored appears to require being world writeable/readable (i.e. permission 777) for it to work. This means anyone can access the text files with a browser! These text files include the blog/comment poster's IP and email address! This is not secure or good practice, right?
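
    Short of switching engines, the usual stopgap is to keep the loose filesystem permissions the engine needs while denying direct browser access to the content directory. A hedged sketch, assuming Apache and assuming SimplePHPBlog reads those files through PHP rather than linking to them directly (worth verifying before relying on it):

      # content/.htaccess - block direct HTTP requests to the raw text files.
      # Apache 2.2 syntax; on Apache 2.4 use "Require all denied" instead.
      Order deny,allow
      Deny from all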

    Read the article

  • Why do non-working websites look like search indexes?

    - by Asaf
    An example of this can be seen on any .tk domain that doesn't exist - just type www.givemeacookie.tk or something similar and you land on a fake search engine/index page. These things are all over the place. I used to ignore them because they were obviously just fake templates, but they're everywhere now. I was wondering where this originates (here it's searchmagnified, but on other sites it's something equally generic). Is there a proper name for this type of website?

    Read the article

  • Hosting multiple low-traffic websites on EC2

    - by Niko Sams
    We have around 30 websites with almost no traffic (<~10 visits/day) which are currently hosted on a dedicated server. We are evaluating hosting them on Amazon EC2, but I'm not sure how to do that properly. One (micro) instance per website is too expensive; ~10 websites on one instance (using Apache virtual hosts) makes auto-scaling impossible (or at least difficult). Or is cloud computing not suitable for such a use case?

    Read the article

  • How do I set up anonymous email forwarder using cPanel?

    - by Gravitas
    Some companies demand your email address, then send you spam. I'm quite familiar with cPanel. How would I set up an anonymous email forwarder, so I can give them a valid email address, and kill that email address if the company turns into an evil spammer? Note that to be effective, it would have to filter out any email addresses listed in the body of the forwarded email (otherwise those email addresses will end up on their spam list too).

    Read the article

  • Firefox stalls on rendering when Chrome doesn't

    - by amccormack
    I have a webpage that loads quickly 100% of the time in Chrome, but only 10% or so of the time in Firefox. Looking at the Fiddler capture, Firefox only loads 2 of the 100-ish files being pulled before it hangs. The problem does not seem to be on the server or network side, however, because Chrome never encounters it. How do I find the root cause of this stall? I suspect Firefox's JavaScript execution is what is causing the hang; are there any particular methods to narrow down the search for the bad code?

    Read the article

  • SEO for site with 301 redirect on root domain to subfolder

    - by Kim
    I've been asked to do SEO for a site. The site is built using WordPress and PrestaShop, and because of this the root domain has a 301 redirect to a subfolder - domain/shop/. For my SEO submission work, I know it's not good practice to submit URLs that have redirects on them, and a lot of the time it's not allowed. After searching the net I think my best bet is to do all my site submissions using the URL domain/shop/, even though it will take a lot more work to get that URL up in the rankings than it would for the root domain. I'm not sure how this will work: normally the root domain has the greatest rank and then passes rank to the rest of the site. If I'm targeting the subfolder instead, will it work?

    Read the article

  • cPanel email doesn't seem to work - error 550?

    - by Megh
    I am fairly new to the web hosting game, so bear with me :) Recently set up a VPS with cPanel and WHM. Everything is going well so far, I've created a user domain and transferred my website there, managed a couple of databases with phpmyadmin, everything was going great until I started messing around with email. I made an email account [email protected] through cPanel, although when I try and email this address I get the following error: Technical details of permanent failure: Google tried to deliver your message, but it was rejected by the recipient domain. We recommend contacting the other email provider for further information about the cause of this error. The error that the other server returned was: 550 550 Unknown user (state 13) Quite unsure of what to do next, in all honesty.

    Read the article

  • How to protect SHTML pages from crawlers/spiders/scrapers?

    - by Adam Lynch
    I have a LOT of SHTML pages I want to protect from crawlers, spiders and scrapers. I understand the limitations of SSIs. An implementation of the following can be suggested in conjunction with any technology/technologies you wish: the idea is that if you request too many pages too fast, you're added to a blacklist for 24 hours and shown a captcha instead of content on every page you request. If you enter the captcha correctly, you're removed from the blacklist. There is a whitelist so GoogleBot, etc. will never get blocked. What is the best/easiest way to implement this idea? Server = IIS. Cleaning the old tuples out of a DB every 24 hours is easily done, so no need to explain that.
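
    The behaviour described above is essentially a sliding-window rate limiter with a 24-hour ban list sitting in front of the pages. Since the stack is IIS, the sketch below assumes PHP is available on it, and the thresholds, table names and captcha hand-off are all illustrative:

      <?php
      // throttle.php - sliding-window rate limiter with a 24h blacklist and a crawler whitelist.
      // Illustrative thresholds: more than 60 requests in 60 seconds earns the blacklist.

      const WINDOW_SECONDS  = 60;
      const MAX_REQUESTS    = 60;
      const BLACKLIST_HOURS = 24;

      function checkVisitor(PDO $pdo, string $ip, string $userAgent): string
      {
          // Whitelisted crawlers (GoogleBot etc.) are never throttled.
          $stmt = $pdo->prepare("SELECT COUNT(*) FROM whitelist WHERE :ua LIKE CONCAT('%', pattern, '%')");
          $stmt->execute([':ua' => $userAgent]);
          if ($stmt->fetchColumn() > 0) {
              return 'ok';
          }

          // Already blacklisted within the last 24 hours? Serve the captcha instead of content.
          $stmt = $pdo->prepare('SELECT COUNT(*) FROM blacklist
                                  WHERE ip = :ip AND banned_at > DATE_SUB(NOW(), INTERVAL ' . BLACKLIST_HOURS . ' HOUR)');
          $stmt->execute([':ip' => $ip]);
          if ($stmt->fetchColumn() > 0) {
              return 'captcha';
          }

          // Record this hit and count the recent ones.
          $pdo->prepare('INSERT INTO hits (ip, hit_at) VALUES (:ip, NOW())')->execute([':ip' => $ip]);
          $stmt = $pdo->prepare('SELECT COUNT(*) FROM hits
                                  WHERE ip = :ip AND hit_at > DATE_SUB(NOW(), INTERVAL ' . WINDOW_SECONDS . ' SECOND)');
          $stmt->execute([':ip' => $ip]);

          if ($stmt->fetchColumn() > MAX_REQUESTS) {
              $pdo->prepare('INSERT INTO blacklist (ip, banned_at) VALUES (:ip, NOW())')->execute([':ip' => $ip]);
              return 'captcha';
          }
          return 'ok';
      }

      // Usage (connection details illustrative; captcha.php is hypothetical):
      $pdo = new PDO('mysql:host=localhost;dbname=throttle;charset=utf8', 'web_user', 'secret');
      if (checkVisitor($pdo, $_SERVER['REMOTE_ADDR'], $_SERVER['HTTP_USER_AGENT'] ?? '') === 'captcha') {
          header('Location: /captcha.php');
          exit;
      }

    Correct captcha entry would then delete the visitor's blacklist row; matching crawlers by verified reverse DNS rather than by user agent string is the more robust way to build the whitelist.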

    Read the article

  • What nameserver should I use?

    - by Qmal
    Let's say I have a website, site.com, that I bought in one place but want to host in another. I don't know what to do. Here is the scenario: I bought site.com from a company that uses its own nameservers, ns.a.com, and the domain is currently linked to their servers. I bought hosting from another company whose nameservers are ns.b.com. Should I just change ns.a.com to ns.b.com on my domain, or should I point all the DNS entries in my domain control panel to the host's IP addresses?

    Read the article

  • Is there a way to disallow only crawling in https in robots.txt?

    - by David Wilkins
    I just realized that Bingbot is crawling my company's website's pages over https. Bing already crawls the site over http, so this seems frivolous. Is there a way to specify Disallow: / for https only? According to Wikipedia, each protocol has its own robots.txt, and according to Google's Robots.txt Specification, the robots.txt applies to both http and https. I don't want to Disallow: / for Bing entirely, just over https.
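
    One workaround, assuming the site runs on Apache with mod_rewrite (a sketch, not something either spec mandates): keep the existing robots.txt for http and rewrite requests for /robots.txt to a more restrictive file only when they arrive over https.

      # .htaccess - serve a separate, restrictive robots file for https requests only
      RewriteEngine On
      RewriteCond %{HTTPS} on
      RewriteRule ^robots\.txt$ robots-https.txt [L]

      # robots-https.txt (companion file) would contain just:
      #   User-agent: *
      #   Disallow: /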

    Read the article

  • Why do some user agents have spam URLs in them?

    - by Erx_VB.NExT.Coder
    If you go to (say) the last 100 entries (visits) on the botsvsbrowsers.com website (exact link, feel free to take a look: http://www.botsvsbrowsers.com/recent/listings/index.html ), you'll notice that almost every User Agent containing the keywords "Opera" and "Presto" has a web link (URL/web address) inside it, and it isn't just a plain web address but an HTML anchor tag/link to that address. Why is this so? I could not find a single discussion about it anywhere on the internet, even after varying my search terms many times. A user agent containing the words "Opera" and "Presto" doesn't always have this web link, but there is about an 80% chance that it will. A typical anchor tag/link inside a user agent looks like this: Mozilla/4.0 <a href="http://osis-uk.co.uk/disabled-equipment">disability equipment</a> (Windows NT 5.1; U; en) Presto/2.10.229 Version/11.60 If you check it out at http://www.botsvsbrowsers.com/recent/listings/index.html you will notice that the angle brackets appear there in unescaped form. This isn't just true for botsvsbrowsers but for several other user agent listing sites. I'm really confused and feel like I'm in a room full of 10,000 people and am the only one seeing this ghost :). If I'm doing statistical analysis, should I include or exclude this type of user agent from my listing (i.e. are these just normal users who've set their user agents to try to drive some traffic to their sites as they browse the web), or is there something else going on? The consistency of the format leads me to believe the setting or alteration of the user agent is an automated process, but I cannot work out which program or facility is doing it, especially since it seems exclusive to Opera (Presto) user agents beyond roughly version 8 or 9. I've run some statistical tests, parsing entries from all over the place and writing custom programs, to get a better understanding of this. Keep in mind that I do see normal URLs in user agents occasionally - plain text such as +http://www.someSite.com appended to a user agent, especially if it's a crawler or bot providing its service URL. That is normal and isn't done with an embedded link (A HREF=), so I'm not talking about those.

    Read the article
