Search Results

Search found 9763 results on 391 pages for 'ys pro'.


  • What are some reputable merchant account providers for high-risk payment web sites?

    - by GregH
    I am helping to set up an online cigar web site. However, it has become a real pain to take payments online since tobacco is considered a "high-risk" item and nobody will provide a merchant account to process the payments. It looks like there are companies that specialize in high-risk merchant accounts. I was wondering if anybody could recommend a high-risk merchant account and payment processing provider?

    Read the article

  • fsockopen() error: "Network is unreachable" on port 43 in PHP [closed]

    - by hamid
    I've written some PHP code that looks up whois data for a domain, but it fails. Here is part of my code:

        function checkdomain($server, $domain) {
            global $response;
            $connection = fsockopen($server, 43);
            fputs($connection, "domain " . $domain . "\r\n");
            while (!feof($connection)) {
                $response .= fgets($connection, 4096);
            }
            fclose($connection);
        }

        checkdomain("whois.crsnic.net", "www.example.com");

    The code works on my localhost (Apache, PHP, MySQL on Windows XP), but when I uploaded it to my Linux host it failed, and I always see the error below:

        Warning: fsockopen() [function.fsockopen]: unable to connect to whois.crsnic.net:43 (Network is unreachable) in /home/hamid0011/public_html/whois/whois.php on line 37

    What should I do? Is this a problem with my host, the whois server (though it works on localhost), or my code? Thanks.
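
    A hedged aside, not part of the original post: the usual first step is to check fsockopen()'s return value and capture $errno/$errstr, which turns the bare warning into a concrete diagnosis. A minimal sketch - the function name and the 10-second timeout are illustrative:

        function checkdomain($server, $domain) {
            // Capture the error code/message instead of letting the warning escape.
            $connection = @fsockopen($server, 43, $errno, $errstr, 10);
            if ($connection === false) {
                // "Network is unreachable" here usually means the host blocks
                // outbound connections on port 43 - a common shared-hosting restriction.
                return "Connection failed ($errno): $errstr";
            }
            fputs($connection, "domain " . $domain . "\r\n");
            $response = '';
            while (!feof($connection)) {
                $response .= fgets($connection, 4096);
            }
            fclose($connection);
            return $response;
        }

        echo checkdomain("whois.crsnic.net", "example.com");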

    Read the article

  • Apache URL Rewrite

    - by sgtbeano
    I'm trying and failing to get a URL rewrite working. Firstly, I'm doing it in the vhost declaration - is that right? What I'm trying to do is take any URL of the form view.php?id=[a one- or multi-digit number] and rewrite it to view.php?id=[number]&section=1. Any help would be greatly appreciated; thanks for looking.

    Okay, so I tried the suggestion below (thanks for that) and now have this in my vhost file, but still no effect:

        NameVirtualHost *:80
        <VirtualHost *:80>
            ServerAdmin ########
            DocumentRoot "########"
            ServerName ########
            ErrorLog "logs\########.log"
            <Directory "########">
                DirectoryIndex index.php index.html
                AcceptPathInfo on
                Order allow,deny
                Allow from All
            </Directory>
            <Location />
                RewriteEngine on
                RewriteRule ^/view.php?id=([0-9]*)$ /view.php?id=$1&section=1 [R]
            </Location>
        </VirtualHost>

    Any more suggestions? Thanks again.
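
    A hedged aside, not the thread's accepted answer: a RewriteRule pattern is matched against the URL path only, never the query string, so ^/view.php?id=([0-9]*)$ can never match. The query string has to be inspected with a RewriteCond. A minimal sketch:

        RewriteEngine On
        # The query string is not part of what RewriteRule matches,
        # so capture the id from %{QUERY_STRING} instead.
        RewriteCond %{QUERY_STRING} ^id=([0-9]+)$
        RewriteRule ^/?view\.php$ /view.php?id=%1&section=1 [R,L]

    The trailing &section=1 in the rewritten query string stops the condition from matching again, which avoids a redirect loop.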

    Read the article

  • News Portal CMS

    - by George Grigorita
    I am looking for a specific news portal CMS. I know all the major "general" CMSs (like WordPress, Drupal or Joomla) and even the less known ones (like TYPO3, ExpressionEngine, Textpattern or Concrete5). I'm already working with a Drupal distribution called OpenPublish and another WordPress installation to determine which would be better, but these are more of a Plan B. I would like to work directly with a CMS that was built exactly for the kinds of tasks specific to a news/media portal. It doesn't matter if the CMS is commercial (however, I don't want to pay a monthly fee) or free, but I need to be able to use it on my own server/hosting and I need to be able to access its source code (not to modify it, but to integrate it with future plugins/modules). If you know any CMS that qualifies for this job, please let me know. In the last few days I was all over Google but I couldn't find anything worth mentioning.

    Read the article

  • Need assistance matching a general theme style as well as eCommerce capability

    - by humble_coder
    I'm in the process of acquiring a new design client. They are getting into the business of "auto parts wholesaling" and they want a storefront. My preference is/was to create something from scratch. However, there is an established trend in their particular market (similar parts, layout, etc.). They insist on following the existing visual trend, as per the following:

        http://www.xtremediesel.com/
        http://www.thoroughbreddiesel.com/
        http://www.alligatorperformance.com/

    My plan of attack at this point is to find a comparable WP theme and a flexible (but useful) backend for product management. Their current demo site (which their previous developer made a stab at) is using Pinnacle Cart. It is nowhere near what they need, nor is it intuitive to work with. I was actually considering Magento for its greater abilities, but I'm still weighing options. That said, my two primary dilemmas are as follows:

    1) I need a theme that mimics the general style of those listed. They explicitly said they didn't want anything too clean (e.g. ThemeForest, WooThemes) as it "wasn't rugged or busy looking enough" for their field.

    2) I need a WP/Magento/WP e-Commerce (or any one of a host of other) plugin that will allow for bulk import/update of nearly 200,000 products, descriptions and images. I'm not opposed to manually interfacing with the DB for import, but in the end, I need a store/system that doesn't needlessly add 50 tables to accommodate some "wet behind the ears" concept of table normalization, and that is easy to add to.

    Anyway, if anyone has any quality suggestions regarding either of these issues, it would be most appreciated. Best.

    Read the article

  • Apache no longer starts at Windows boot up

    - by w3d
    I have Apache installed as part of XAMPP - a local test server. It is configured as a Windows (XP) Service and its startup type is "Automatic". For a long time it always started when Windows booted, but recently this has stopped happening. I now need to start it manually via the XAMPP Control Panel - at which point it appears to start up perfectly OK. The only recent updates to the machine (that I recall) are Windows Updates - none of which appear to have "known issues" that relate to this - and updates to Google Chrome. Any ideas what could prevent Apache from starting automatically at Windows (XP) boot up?

    EDIT#1: There are 2 related errors in my system event log regarding the Service Control Manager:

        Timeout (30000 milliseconds) waiting for the Apache2.2 service to connect.
        The Apache2.2 service failed to start due to the following error: The service did not respond to the start or control request in a timely fashion.

    When I manually start the Apache server after boot up, there are 2 "information" events stating that it was "sent a start control" and that it "entered the running state". I notice it takes 19 seconds between the start control being sent and entering a running state, according to the event log. So maybe 30 seconds during boot up isn't long enough (anymore) for Apache to start?
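
    A hedged aside, not from the original thread: the 30000-millisecond figure in the first event is Windows' default service start timeout, and it can be raised via the ServicesPipeTimeout registry value. A sketch of that commonly suggested workaround (value in milliseconds; a reboot is required, and registry edits are at your own risk):

        reg add "HKLM\SYSTEM\CurrentControlSet\Control" /v ServicesPipeTimeout /t REG_DWORD /d 60000 /f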

    Read the article

  • Bingbot requests from Google IP address

    - by JITHIN JOSE
    We have some suspicious requests to our server:

        74.125.186.46 - - [24/Aug/2014:23:24:11 -0500] "GET <url> HTTP/1.1" 200 16912 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
        74.125.187.193 - - [24/Aug/2014:23:24:12 -0500] "GET <url> HTTP/1.1" 200 20119 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"

    As it shows, the user-agent claims to be bingbot, but whois data for the IP addresses (74.125.186.46 and 74.125.187.193) shows they belong to Google's servers. So is it Google, Bing, or some other content scraper?
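
    A hedged aside for readers hitting the same logs: both Google and Bing document reverse-DNS verification for exactly this case - a genuine crawler IP resolves to a hostname under the engine's domain, and that hostname resolves back to the same IP. A PHP sketch of that forward-confirmed reverse DNS check (the helper name is illustrative):

        function isGenuineCrawler($ip) {
            // Reverse lookup: gethostbyaddr() returns the IP unchanged on failure.
            $host = gethostbyaddr($ip);
            if ($host === false || $host === $ip) {
                return false; // no usable reverse DNS record
            }
            $knownDomain = preg_match('/\.(googlebot\.com|google\.com|search\.msn\.com)$/i', $host);
            // Forward-confirm: the hostname must resolve back to the original IP,
            // otherwise the PTR record could simply be spoofed.
            return $knownDomain && gethostbyname($host) === $ip;
        }

        var_dump(isGenuineCrawler('74.125.186.46'));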

    Read the article

  • Why do SEO-based code tips not appear to affect ranking?

    - by Ben
    I've been researching various methods for SEO where pages have precise titles, keywords are highlighted with h tags, and all the boxes stated in good page markup for SEO are ticked. However, when looking at some top-ranked sites on Google for key terms, they have terrible SEO-based markup: really long page titles, no h tags, limited appearance of keywords in the text, and so on. SEO analysis services rate them lower than other sites, yet these sites rank really high. Even with a low number of back-links they rank high, so I don't understand how these sites earn their position when they appear inferior to those below them which have better markup and links. I don't want to cause trouble by mentioning sites or keywords, but searching Google for 'executive search', the roughly 5th-placed site makes no sense as a high ranking, especially with all the added .swfs. The same applies to the top result for 'Japan Executive Search'. My main point is that these sites seem to lack all the important structural rules stated in SEO page-rating applications and general suggested best practice, nor do they show large numbers of back-links. It makes me feel like there is no point bothering to write decent markup if it really doesn't matter. Can anyone explain how sites with such markup and low back-links can outrank well-written and well-structured sites with greater linkage? Sorry if this is a fuzzy question - I want to avoid singling out any sites as examples, but it really has me perplexed that sites which appear to ignore the suggested best practices rank so well.

    Read the article

  • Google+1 button strategy - Combined +1s or separate +1s?

    - by nctrnl
    I have included the Google +1 button on my blog. Each post outputs a +1 button at the bottom. Depending on whether you are viewing the actual post or just the main page, the +1 button will "+1" either the post address or the blog website address. This made me wonder whether the +1 button should be configured to +1 the blog section (www.example.org/blog), +1 the main website address (www.example.org), or +1 individual posts?

    Read the article

  • Does a redirect popup window affect SEO?

    - by Joseph
    We have multiple websites, each serving a number of countries, and we used to have a Geo-IP auto-redirect system (no one likes auto-redirects). So we implemented another redirect system, also using a Geo-IP database, but showing a pop-up window (an HTML layer pop-up, so it can't be blocked). This window asks visitors if they would like to continue with this page or go to the correct website for their country. We also added a test line before showing the pop-up, so if the visitor is Googlebot, the pop-up will not show up :). I was wondering if this affects our websites' SEO?
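
    For reference, a minimal sketch of the kind of bot test the poster describes - a plain user-agent check. This is an illustration, not the poster's actual code, and note that deliberately serving bots different content can itself be judged as cloaking:

        // Skip the geo-IP pop-up for common crawler user-agents.
        $userAgent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
        $isBot = (bool) preg_match('/googlebot|bingbot|slurp/i', $userAgent);

        if (!$isBot) {
            include 'geoip_popup.php'; // hypothetical include that renders the overlay
        }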

    Read the article

  • DNS Configuration when registrar and host are two different companies

    - by dclowd9901
    I'm a total noob when it comes to DNS configuration. My client bought a domain through one company and is hosting their site with another (a virtual dedicated server). I can't find anything on the web that explains one's way through this setup. Where do I start? Which nameservers do I use? Which company's zone files do I edit? Basically it boils down, for me, to not understanding which company takes the lead and which piggybacks off the other's configuration. Thanks in advance for the help.
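
    A hedged sketch of the usual answer: either hand DNS to the host (set the registrar's nameserver records to the host's nameservers and edit zones at the host), or keep the registrar's DNS and point records at the server's IP. The second option in zone-file form, with placeholder values:

        ; Zone served by the registrar's DNS (203.0.113.10 is a placeholder IP)
        example.com.        3600    IN  A       203.0.113.10
        www.example.com.    3600    IN  CNAME   example.com.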

    Read the article

  • Getting bank account information from a bank and displaying on a website [closed]

    - by Ali Syed
    Hello, I am looking for a way to get bank account information (transactions and balance) from a financial institution and display it on a website. The question is intentionally vague: everything is open. I haven't chosen a bank, a server-side technology or a front-end technology. The idea is to have a script run automatically and periodically (once or twice a day) and fetch the latest account information from the bank's server. Probably something in the direction of OFX (Open Financial Exchange), HBCI (Home Banking Computer Interface), FinTS, or something like them. Even working over a closed-source API is not out of the question: Wesabe or Mint or something similar. PayPal is not an option because it won't work in India or Pakistan. Cheers.

    Explanation: I have an exclusive small club. My members make irregular payments. These transactions should be online for all members (with login) to see.

    Read the article

  • Which is better: a link on one page or on all pages?

    - by Ervin
    I have a website, http://www.amenajari-gradini-mures.ro, and I will put links to it on http://www.casesigradini.ro. I would like to know which is better from an SEO point of view: to have one single link on the homepage, or to have a link on every page (about 48,000 pages)? Right now my link is up on every page, but if it's better to have it on one page only (or maybe on a few main pages) then I'll take the rest out. Please give some arguments for your answers. Thanks.

    Read the article

  • RewriteRules targeting a directory result in a gratuitous redirect

    - by MapDot
    I have a standard CMS-like RewriteRule set up in my .htaccess:

        RewriteRule ^(.+)$ index.php?slug=$1

    Let's say I have a directory called "foo" in the root directory. For some reason, if you hit that page it causes a redirect:

        http://www.mysite.com/foo -> http://www.mysite.com/foo?slug=foo

    Removing the directory fixes the problem, but unfortunately that's not an option. Does anyone know of a workaround?
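
    A hedged note on the usual shape of the fix: the extra redirect comes from mod_dir appending a trailing slash to requests for real directories after the rewrite has already run. The standard front-controller pattern sidesteps it by leaving existing files and directories alone. A sketch:

        RewriteEngine On
        # Leave requests for real files and directories untouched so mod_dir's
        # trailing-slash redirect doesn't interact with the rewrite.
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.+)$ index.php?slug=$1 [L,QSA]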

    Read the article

  • Submitting a sitemap to take care of inherited Google crawler errors

    - by leeand00
    I have an awful lot of Google crawler errors (1,000 or so) after inheriting a site whose previous owner migrated it without moving much of their content. Would generating a map of the current site and submitting it to Google help fix this? Is there any quicker, automated way to eliminate the errors other than clicking through each and every one? Note: I have already tried automating this on my own.

    Read the article

  • Tracking click conversions with Google Analytics

    - by Joel
    Is there any way I can use Google Analytics to track click conversions on a link? For example, if I have a link to www.a.com, is it possible for Google to track the number of times that particular link was shown on my page and then track how many times it was actually clicked? The problem is that I do not show the link to www.a.com every time the page loads; I am using a random function (server side) to generate a different link each time. I would like Google Analytics to provide me with the click conversion rate for each of the links I choose to show the user. Thanks, Joel

    Read the article

  • What constitutes a "substantial, good-faith effort to remove the links"?

    - by Luke McCallum
    We engaged the services of a 3rd-party SEO consultant to assist us in managing our meta data and to write regular blogs on our site http://cyberdesignworks.com.au

    Without our authorisation, the SEO also ran a link-building campaign which has seen us Penguin-slapped, and we no longer appear in Google for a number of our core keywords. Since notification by Google that we have "unnatural links" back in March, we have undertaken a significant campaign to rid ourselves of these dodgy backlinks by a number of methods. I have just received feedback on my 4th or 5th resubmission, which is still advising that we need to make a "substantial, good-faith effort to remove the links" before Google will reconsider us for inclusion. After the effort that I have gone through to get links removed, I am now at a loss as to what else I can do to demonstrate a "substantial, good-faith effort to remove the links". Below is a summary of the actions that we have taken to date.

    - According to http://removem.com we had about 5584 back-linking domains. Of those, we have successfully contacted and had links removed from 344 domains.
    - We ignored links from 625 domains as they were either legitimate press releases, natural backlinks or client websites containing an attribution link in the footer that points back to us.
    - Due to our efforts, or the sites simply becoming defunct, removem.com reports that links from 3262 domains have been removed.
    - We have contacted but are yet to receive feedback from 1666 domains, so we can assume that those backlinks remain. We have configured an automatic 301 redirect for each of the links from these 1666 domains to point to http://redirects.sanscode.com/ which we are calling our Bad Link Catcher (a stroke of genius, I thought), e.g. http://www.mysimplewebdesign.com/create-a-perfect-webpage-with-four-important-tips-from-sydney-web-development-service-companies.php
    - As we are a web design agency, we have a large number of client websites which contain an attribution link in their footer pointing back to us. We have gone through the vast majority of these and updated the links to replace the anchor text with an image and a rel="nofollow" link, i.e.

          <a rel="nofollow" target="_blank" href="http://www.cyberdesignworks.com.au/"><img src="https://sessions.sanscode.com/site/assets/media/badges/Badge_CDW_SANSCODE.png"></a>

      See http://www.milkatwork.com.au/
    - An export from http://removem.com detailing the number of times we have contacted each link, and whether it is still found or not, was also supplied with each resubmission.
    - The total number of back links reported in Google Webmaster Tools has dropped from over 100K to 87K, and I expect it to drop significantly lower once Google re-crawls each back-linking page.

    Based on all of the above, I am not sure what else I can do to demonstrate a "substantial, good-faith effort to remove the links". I would sincerely appreciate any feedback or suggestions that you may have, as I am out of ideas.

    Read the article

  • How to get Magento to update order status when PayPal returns IPN message?

    - by Nick
    When someone checks out in Magento with PayPal, and PayPal flags their payment for review, Magento correctly sets the order status to "Payment Review". However, if after a day or two PayPal decides the order is OK, it sends an IPN message to Magento with the proper payment status of "Pending" and a pending reason of "authorization". I can see this IPN message in Magento's PayPal logs (and can simulate it with the sandbox); however, when Magento receives this message it does not update its order status. Why not, and how can this be fixed? I am using Magento 1.5.1.0.

    Read the article

  • How can I point wildcard domains to a folder in Apache?

    - by Abishek R Srikaanth
    I am developing an app using PHP and deploying it on Apache in the Amazon AWS environment. This app needs to be made available to customers from their own chosen domain names. How can I achieve this? For example:

        www.customer1.com -> /var/www/myapp.mydomain.com
        www.customer2.com -> /var/www/myapp.mydomain.com

    I would like to do this similar to how bitly enables shortened URLs for custom domains: www.myshrturl.com is DNS-configured as a CNAME to cname.bitly.com. I would appreciate it if someone could help me achieve this functionality. If any other details are required, please let me know and I shall update the question.
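
    A hedged sketch of one common approach: make the app's vhost the catch-all for any Host header - customers then point a CNAME (or A record) at the app - and let PHP decide which tenant to serve from $_SERVER['HTTP_HOST']. Directives are Apache 2.2-era to match the question; names and paths are placeholders:

        <VirtualHost *:80>
            ServerName myapp.mydomain.com
            # Catch every other Host header that reaches this IP,
            # e.g. www.customer1.com after the customer adds a CNAME.
            ServerAlias *
            DocumentRoot /var/www/myapp.mydomain.com
        </VirtualHost>

    Inside the application, something like $tenant = $_SERVER['HTTP_HOST']; can then map the requesting domain to the right customer's data.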

    Read the article

  • Tracking multiple subdomains and domains going to the same site, separately in Google Analytics

    - by miles
    I have a new site that has multiple top-level domains and subdomains all pointing to it: www.domain.com, campaign.domain.com, chicago.domain.com and domain2.com all go to the same site/site directory. Right now I have one Google Analytics profile set up for it, but I want to be able to track the traffic hitting those different URLs separately. The domains are being routed server-side (not via .htaccess). How can I do this in Google Analytics? Do I need to create filters, or create different profiles for each domain?

    Read the article

  • Quoting people for website dev. work

    - by Jason
    Hi all, I have recently given quotes to a few people, and I need some advice about how things should be done.

    Q1: I've seen, heard of, and read about a lot of developers using free online resource sites to obtain free privacy policies, disclaimers, etc. for their customers' websites. A customer I quoted the other day expected me to write/get a disclaimer for their site. Who in their right mind would expect a document like that from a web developer? I just told them that they need to sort that stuff out themselves with a lawyer or something, and then send it to me so I can paste it on a webpage for them.

    Q2: If you're charging per hour, and you estimate that the project would take 1 week to finish (including testing/releasing), but you soon realise that you'll require more time, do you re-quote them? Or do you just finish off the site at the original quoted price?

    Q3: How do you figure out how much to charge your customers? Do you charge per feature, per hour, per day, or all of the above?

    Thanks :)

    Read the article

  • Embarking on a website redevelopment and all developers pushing to move to ASP.NET 4.0

    - by Sue
    Our company is going through a website redevelopment/retooling exercise and we are not quite sure which direction to take. We are told that the website was built in Classic ASP and that we should be moving to ASP.NET 4.0. Some developers refuse to do any work in the Classic ASP framework, citing the advantages of ASP.NET 4.0: stability, compilation and language support. We are generally happy with our website as is. There are some kinks in the backend involving forms, and there is little integration between the website's CRM and any content management system. Does the move from Classic ASP to ASP.NET 4.0 give major advantages in the integration between how content is created and how it is delivered to our customers?

    Read the article

  • Hiding PHP includes from search spiders?

    - by 21stcn
    A quick and simple question. I have 80+ HTML files which I want to be crawled. They are individual product pages. Each of these pages pulls in its content using PHP includes. These PHP include files are in a separate folder on the server and contain the core content for the individual product pages. I just wanted to ask: if I use robots.txt or .htaccess to prevent crawling of the directory that holds the PHP content files, will there be any issue crawling the HTML pages which include those files? What I want to achieve is to have the HTML files indexed with the PHP content included in them, but I don't want visitors landing on the PHP content pages, nor these PHP files indexed as duplicate content. I just need clarification as to whether it is safe to block spiders from accessing the PHP folder without affecting the HTML files being indexed with the included content. Is this the best way to do things, or should I just leave the content PHP files to be crawled?
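
    A hedged note on why blocking is safe, with a sketch: include() reads the content files from the filesystem, not over HTTP, so denying web access to the includes directory cannot affect how the HTML pages render or get indexed. A minimal Apache 2.2-style .htaccess for that folder (the directory name is a placeholder):

        # .htaccess placed inside the /includes directory (hypothetical name).
        # Blocks direct HTTP requests; PHP's include() still reads the files
        # from disk, so the product pages render unchanged.
        Order allow,deny
        Deny from all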

    Read the article

  • Which shopping cart / ecommerce platform to choose?

    - by fabien7474
    I need to build an ecommerce website within a tight budget and schedule. Of course, I have never done that before, so I have googled my options and concluded that the following are no longer valid candidates:

    - Magento: steep learning curve
    - osCommerce: old, badly designed, buggy and not user-friendly
    - Zen Cart, CRE Loaded, CubeCart: based on osCommerce
    - VirtueMart, Ubercart, eCart: based on CMSs (Joomla, Drupal, WordPress) that are not necessary for my use case

    So I finally narrowed my choices down to these solutions:

    - PrestaShop: easy to use, with a great templating engine (Smarty), but many indispensable modules are not free
    - OpenCart: security issues and not great support from the main developer (see here and here)

    So, as you can see, I am a little bit confused, and if you can help me choose an easy-to-use, lightweight and cheap (not necessarily free) ecommerce solution, I would really appreciate it. By the way, I am a Java/Grails programmer, but I am also familiar with PHP and .NET (not with Python or Ruby/Rails).

    EDIT: It seems that this question is more appropriate for the Webmasters Stack Exchange site, so please move it to where it belongs (I cannot do that) instead of downvoting it. BTW, I have found a quite similar and quite popular question on SO: http://stackoverflow.com/questions/3315638/php-ecommerce-system-which-one-is-easiest-to-modify

    Read the article

  • Legal responsibility for embedding code

    - by Tom Gullen
    On our website we have an HTML5 arcade. Each game has an "embed this game on your website" copy-and-paste code box. We've made the game approval process as strict and safe as possible; we don't actually think it is possible for any malicious code to get into the games. However, we are aware that there's a bunch of people out there smarter than us who might be able to exploit it. For webmasters wanting to copy and paste our games onto their websites, we want to warn them that they are doing it at their own risk - but could we be held responsible if, for instance, a malicious game were hosted on an important website and it stole that site's users' credentials and caused them damage? I'm wondering if having an HTML comment in the copy-and-paste code saying "Use at your own risk" is sufficient.

    Read the article
