Search Results

Search found 11896 results on 476 pages for 'smart pro'.

Page 235/476 | < Previous Page | 231 232 233 234 235 236 237 238 239 240 241 242  | Next Page >

  • Can SSL Wildcards have multiple/nested levels of wildcard?

    - by Don Faulkner
    I know that an SSL wildcard certificate (*.example.org) can be used to support many names under the domain (a.example.org, b.example.org, c.example.org). I also know that the * is only good for matching a single level of name; that is, *.example.org will not work for a.b.example.org. What if I used a certificate with the name *.*.example.org? I'd like to build a certificate with the following name configuration:

        CN=example.org
        subjectAltName=DNS:example.org, DNS:*.example.org, DNS:*.*.example.org, DNS:*.*.*.example.org

    I've tried building a few like this as self-signed certificates, but I've not had good results. For example, Chrome tells me "Server's certificate does not match the URL." Is it possible to have nested wildcards in a certificate, or do the popular browsers not support this?
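
    A minimal OpenSSL sketch for reproducing such a test certificate locally (file names and the CN are placeholders, not taken from the question). Note that browsers follow RFC 6125, which only lets * match a single left-most label, so the multi-level entries are generally ignored even when the certificate itself is accepted:

        # san.cnf: request config with the wanted SANs
        [req]
        prompt = no
        distinguished_name = dn
        req_extensions = v3_req
        [dn]
        CN = example.org
        [v3_req]
        subjectAltName = DNS:example.org, DNS:*.example.org, DNS:*.*.example.org

        # self-sign using that config, then inspect the SAN list
        openssl req -new -x509 -nodes -newkey rsa:2048 -days 365 \
          -keyout test.key -out test.crt -config san.cnf -extensions v3_req
        openssl x509 -in test.crt -noout -text | grep -A1 'Subject Alternative Name'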

    Read the article

  • Registration form with payment system (PayPal) [closed]

    - by Alecs
    I'm using an AJAX registration form plugin for my website and I'm thinking of also implementing PayPal. Here is how I want it to work: I have three fields (Name, Phone, Email) and a "Buy" button. After the user types their name, phone and email, they click "Buy" and are redirected to the PayPal payment page (or, if possible, stay on the same page). What I probably need to know is how to enable "Buy" only after the fields (name, phone, email) have been validated. Is there a plugin or a ready-made snippet of code for this, so I don't start building something that already exists?
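
    A minimal server-side PHP sketch of the hand-off (not the plugin's code; the PayPal account email, amount, item name and return URLs are placeholders): validate the three fields first and only redirect to PayPal Payments Standard when they pass.

        <?php
        // Validate the posted fields; refuse the hand-off if anything is missing.
        $name  = isset($_POST['name'])  ? trim($_POST['name'])  : '';
        $phone = isset($_POST['phone']) ? trim($_POST['phone']) : '';
        $email = isset($_POST['email']) ? trim($_POST['email']) : '';

        $errors = array();
        if ($name === '')                               { $errors[] = 'Name is required.'; }
        if (!preg_match('/^[0-9 +().-]{6,}$/', $phone)) { $errors[] = 'Phone looks invalid.'; }
        if (!filter_var($email, FILTER_VALIDATE_EMAIL)) { $errors[] = 'Email looks invalid.'; }

        if ($errors) {
            // Let the AJAX form display the errors instead of redirecting.
            header('Content-Type: application/json');
            echo json_encode(array('ok' => false, 'errors' => $errors));
            exit;
        }

        // Store the registration here, then send the buyer to PayPal.
        $query = http_build_query(array(
            'cmd'           => '_xclick',
            'business'      => 'payments@example.com',
            'item_name'     => 'Registration',
            'amount'        => '10.00',
            'currency_code' => 'EUR',
            'return'        => 'http://example.com/thank-you',
            'cancel_return' => 'http://example.com/cancelled',
        ));
        header('Location: https://www.paypal.com/cgi-bin/webscr?' . $query);
        exit;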

    Read the article

  • Cloudflare displaying cached CSS in-line?

    - by esqew
    I recently enabled CloudFlare on my domain, and when I write HTML like this:

        <link href="css/main.css" rel="stylesheet" />

    the CSS gets displayed inline, like this:

        <style>body{padding-top:40px}span.light{font-weight:lighter!important}span.title{font-size:60px;line-height:1;letter-spacing:-1px;color:inherit}</style>

    When I update the file via FTP, the changes are not reflected, which leads me to believe this is a caching issue. Is this due to CloudFlare? If so, how do I disable the behavior? Edit: I also concluded that caching is behind the behavior after seeing the changes appear once I renamed the CSS file.
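
    A minimal command-line check (the URL is a placeholder): CloudFlare stamps its responses, so the headers show whether the stylesheet is being served from its cache. Temporarily enabling Development Mode in the CloudFlare dashboard, or purging the cache there, is the usual way to see fresh files while working on them.

        # is the CSS coming through CloudFlare's cache?
        curl -sI http://example.com/css/main.css | grep -i 'cf-cache-status\|cf-ray\|last-modified'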

    Read the article

  • Sudden drop of pageviews/visit and increase of bounce rate in Analytics

    - by Tebb
    Google Analytics stats:

        04 June 2012 - Visits: 4,423 | Unique visitors: 3,558 | Pageviews: 77,352 | Pageviews/visit: 17.49 | Visit length: 00:06:26 | Bounce rate: 1.09%
        05 June 2012 - Visits: 4,652 | Unique visitors: 3,825 | Pageviews: 45,087 | Pageviews/visit: 9.69 | Visit length: 00:06:45 | Bounce rate: 19.60%

    From one day to the next, the bounce rate went from 1% to 19%, and the pageviews (and pageviews/visit) dropped by half. The only thing I changed on the site (if I remember correctly) was an advertisement that used JavaScript. Could this be the reason? And if it is, how can I know which of the two sets of stats is the real one?

    Read the article

  • I have domain.com and domain.org to the same site, should I use redirects to avoid duplicate content

    - by bunzip
    I have both the .com and the .org for a domain name, and using Apache I point them to the same site content. I think this might be causing problems with search engines because of duplicate content. I want the .org to be the canonical website. How do others handle this situation? Should I be using 301 redirects to point all the .com requests to the .org? Or should I just use link rel="canonical" on each page, pointing to the .org?
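
    A minimal Apache sketch of the 301 approach (domain names are placeholders; assumes both hosts already resolve to this server). A site-wide redirect like this is the usual choice when only the .org should be indexed, with rel="canonical" as a fallback where a redirect isn't possible:

        <VirtualHost *:80>
            ServerName example.com
            ServerAlias www.example.com
            # send every .com request to the same path on the .org
            Redirect permanent / http://www.example.org/
        </VirtualHost>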

    Read the article

  • Two divs: one fixed width, the other takes the rest

    - by Shamil
    I've got two div containers. One needs to be a specific width (250px); the other should take up the rest of the space. Is there any way I can do this?

        <div class="left"></div>
        <div class="right"></div>  <!-- needs to be 250px -->

        .left {
            float: left;
            width: 83%;
            display: table-cell;
            vertical-align: middle;
            min-height: 50px;
            margin-right: 10px;
            overflow: auto;
        }
        .right {
            float: right;
            width: 16%;
            text-align: right;
            display: table-cell;
            vertical-align: middle;
            min-height: 50px;
            height: 100%;
            overflow: auto;
        }

    Thanks
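
    A minimal CSS sketch of one common answer (not the asker's exact rules): give the fixed column its 250px, float it, and let the other column fill whatever is left via a matching margin. This assumes the fixed-width div is moved before the fluid one in the markup so the float can sit beside it.

        .right {
            float: right;          /* fixed column */
            width: 250px;
            min-height: 50px;
        }
        .left {
            margin-right: 260px;   /* 250px column + 10px gap; no width, so it takes the rest */
            min-height: 50px;
            overflow: auto;
        }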

    Read the article

  • How to install a new CA certificate on Linux?

    - by Dail
    I have bought a cheap SSL certificate to run my website on NGINX. They sent me four .crt files:

        www_mywebsite_it.crt
        AddTrustExternalCARoot.crt
        PositiveSSLCA.crt
        UTNAddTrustServerCA.crt

    I have configured www_mywebsite_it.crt and my .key in NGINX, but I also have to install the other .crt files. How can I do that? I'm using Ubuntu. (The problem is that the SSL certificate shows up correctly in Firefox, Chrome and Opera, but if I use Firefox 4.0.1 (the latest) I get the default Firefox warning for an insecure website.) Thank you!
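
    A minimal sketch of the usual fix (paths are placeholders; the file names are the ones from the question): NGINX has no separate directive for intermediates, so the extra .crt files are appended to the server certificate in one chained file, server certificate first, and that file is handed to ssl_certificate. Browsers that don't already know the intermediates then receive the whole chain.

        # build the chain: server certificate first, then the intermediates
        cat www_mywebsite_it.crt PositiveSSLCA.crt UTNAddTrustServerCA.crt \
            AddTrustExternalCARoot.crt > www_mywebsite_it.chained.crt

        # nginx server block
        server {
            listen 443 ssl;
            server_name www.mywebsite.it;
            ssl_certificate     /etc/nginx/ssl/www_mywebsite_it.chained.crt;
            ssl_certificate_key /etc/nginx/ssl/www_mywebsite_it.key;
        }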

    Read the article

  • Determining if a visitor left your server

    - by Jeepstone
    We have an Apache server running a PHP website. The site is an e-commerce shop. We currently use Barclays as the payment provider, but we are seeing a lot of customers drop out at the point where we transfer them to the payment gateway (hosted by Barclays). I can see specific instances in the shop where orders have been created but not paid for (or failed), but I need to ascertain whether the user definitely left our server or simply never reached Barclays. Is there anything in any of the server/access logs that states when a user was transferred to a different domain?
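
    The access log only records requests to your own server, so the closest thing is to log the hand-off yourself just before issuing the redirect and then compare that against orders that never completed. A minimal PHP sketch (the log path, order variable and gateway URL are placeholders, not the shop's actual code):

        <?php
        // Record the hand-off so unfinished orders can be matched to it later.
        error_log(sprintf(
            "[handoff] order=%s ip=%s ua=%s time=%s\n",
            $orderId,
            isset($_SERVER['REMOTE_ADDR']) ? $_SERVER['REMOTE_ADDR'] : '-',
            isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '-',
            date('c')
        ), 3, '/var/log/shop/handoff.log');

        // Send the customer to the gateway; the 302 for this script is the
        // last entry you will see for them in Apache's access log.
        header('Location: https://gateway.example.com/pay?order=' . urlencode($orderId));
        exit;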

    Read the article

  • Duplicate pages indexed in Google

    - by Mert
    I made a small coding mistake and Google indexed my site incorrectly. This is the correct form:

        https://www.foo.com/urunler/171/TENGA-CUP-DOUBLE-HOLE

    but Google indexed my site like this:

        https://www.foo.com/urunler/171/cart.aspx

    First I fixed the problem and made a sitemap containing only the correct links. Now I check Webmaster Tools and see this: Total indexed 513, Not selected 544, Blocked by robots 0. So I think this is caused by duplicate indexing, and it looks like the "not selected" pages are keeping the correct pages from being indexed. I want to know how to fix the "https://www.foo.com/urunler/171/cart.aspx" links. Should I fix this in code, or should I ask Google to re-index the site? If I should redirect the wrong/duplicate links to the correct ones, how should that be done?
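
    A minimal sketch of the usual cleanup (the URL is the one from the question): put a rel="canonical" tag on the product page so every variant URL that still gets crawled points Google at the one address you want indexed; a 301 redirect from the stray cart.aspx URL to the product URL works as well, and the "not selected" count normally shrinks on its own once Google re-crawls.

        <!-- in the <head> of the product page, whichever URL it was reached by -->
        <link rel="canonical" href="https://www.foo.com/urunler/171/TENGA-CUP-DOUBLE-HOLE" />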

    Read the article

  • Drive By Download Issue

    - by mprototype
    I'm getting a drive-by download issue reported on www.cottonsandwichquiltshop.com/catalog/index.php?manufacturers_id=19&sort=2a&filterid=61 by safeweb.norton.com when I scan the root URL. I have dug through the entire site architecture and code base and removed a few files that were malicious; I upgraded the site's framework and fixed the security holes (mostly SQL injection concerns). However, this one threat still exists and I can't locate it for the life of me, or find any valid research or information on removing this type of threat at the server level; mostly I just find anti-virus vendors wanting to sell their ability to manage it on the client end. Please help. Thanks.
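
    A minimal server-side sketch for hunting leftover injected code (paths are placeholders, and the patterns are common obfuscation markers, not an exhaustive scan): look for eval/base64 payloads and for files or .htaccess rules modified around the time of the compromise.

        # PHP files containing the usual obfuscated-payload functions
        grep -rElZ --include='*.php' 'eval *\( *(base64_decode|gzinflate|str_rot13)' /var/www/ | xargs -0 ls -la
        # anything (code or .htaccess) touched in the last two weeks
        find /var/www/ -mtime -14 \( -name '*.php' -o -name '.htaccess' \) -ls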

    Read the article

  • Why Can't Computers Off My Network See the Site? [migrated]

    - by nmagerko
    I have just set up Apache, PHP, MySQL, etc. on my Ubuntu OS, and I was wondering why computers that are not on my network cannot see the basic index.html that Apache uses as the default. I set up a static IP address for my computer, and computers use 192.168.1.100 to view the simple site. Is there something I am missing that will allow others to access my site? (It is REALLY simple; no graphics, CSS, etc.)
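
    Worth noting that 192.168.1.100 is a private LAN address, so visitors outside the network have to reach the router's public IP, and the router has to forward port 80 to this machine (that part is configured in the router's admin interface, not on Ubuntu). A minimal sketch for ruling out the server side first:

        sudo netstat -tlnp | grep ':80 '   # Apache should be listening on 0.0.0.0:80
        sudo ufw status                    # is a firewall blocking the port?
        sudo ufw allow 80/tcp              # open it if so
        curl -I http://192.168.1.100/      # from the LAN; from outside, test against the public IP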

    Read the article

  • Are people getting away with the "follow 1000s and then unfollow" Twitter trick? [closed]

    - by Baumr
    It seems that more and more people are trying to 'cheat' their way into more Twitter followers. The basic mechanism is: Follow thousands of people on Twitter with the hope that they'll follow you back. Once it reaches a point you're happy with, start gradually unfollowing them. That way, at the end of the day, it'll look like a lot of people follow you unconditionally. I've seen self-proclaimed social media and SEO experts do this. It's clear they want to look influential — and will use black hat social media tactics to do so. I can see how it can work, so is Twitter letting them get away with it? Should it?

    Read the article

  • Avoid pagination pages from appearing higher than real content in SERPS

    - by WordPress Developer
    I have a gallery-type website with about 20k pages, and naturally it uses pagination. However, sometimes /page/2 appears higher in search results than, for example, /post/201339. I'd like to give emphasis to the actual content (posts, videos, whatever the site is about) and not to the pages that merely list this content in a paginated manner. What is the best way to avoid this issue? Maybe a NOINDEX,FOLLOW meta tag on the paginated pages?
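
    A minimal sketch of that approach for the paginated listing templates only (URLs are placeholders; the post templates keep the default indexing): noindex,follow keeps the listings crawlable so link equity still flows to the posts, while rel="prev"/"next" declares the pages as one series.

        <meta name="robots" content="noindex,follow">
        <link rel="prev" href="http://example.com/page/1">
        <link rel="next" href="http://example.com/page/3">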

    Read the article

  • Creating sites with local IPs that point to a distant server

    - by fatnjazzy
    Hi. We are a company distributed across several locations in Europe (real offices). Each office has its own domain: company.de, company.co.uk, company.ch, and so on. Our website servers are located in one place; we can't distribute our site to different locations. How can we create a local IP in each location that points to our main server, so that Google sees us as having a local IP? Explanation: Google has decided to increase your PR if you have a local IP; the thinking is that if you bought a server in the local market, you are serious about your business. We have 8 employees in each office and we can't run a separate server in each one. Does that mean we are not serious about our business? No, and that is why I need to create this illusion. Thanks

    Read the article

  • How to batch remove spamming users and pages they created on MediaWiki?

    - by Problemania
    I'm trying to clean up a MediaWiki instance which has been subjected to spamming and vandalism for a period of time. The current status is that there are a large number of users that only created spam pages but typically did not alter legitimate pages, and there are fewer than 10 users which I know are legitimate and which created a small number of legitimate pages. Abstractly, my idea for fixing the mess is to find the complete list of users that are not in that small set of legitimate users, use the RenameUser extension to rename them all to a single Spammer user, and use the Nuke extension to mass-delete all pages that user created. Any practical advice on how to proceed? Since there are hundreds of spammer users, how do I rename them efficiently? It seems the Renameuser extension does not support automated batch renaming of users from a list or file.
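
    One way to skip the mass renaming entirely is to work from the database: list every page whose first revision was made by an account outside the short whitelist, then feed that list to the core deleteBatch.php maintenance script. A minimal sketch, assuming the classic schema where the revision table still carries rev_user (the whitelist names, database credentials and paths are placeholders):

        -- main-namespace pages created by anyone not on the whitelist
        SELECT p.page_title
        FROM page p
        JOIN revision r ON r.rev_page = p.page_id AND r.rev_parent_id = 0
        JOIN user u     ON u.user_id = r.rev_user
        WHERE p.page_namespace = 0
          AND u.user_name NOT IN ('AdminUser', 'GoodUser1', 'GoodUser2');

        # dump the list to a file and delete the pages in one pass
        mysql -u wikiuser -p wikidb -N -e "<the query above>" > spam_pages.txt
        php maintenance/deleteBatch.php -r "Spam cleanup" spam_pages.txt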

    Read the article

  • Oracle EZConnect in Mediawiki

    - by raindog308
    MediaWiki supports Oracle and I'm trying to configure it in the installer. The installer says you can use EZConnect, something like user/pass@//server.example.com/dbname or, since the installer has fields elsewhere for user/pass, server.example.com/dbname. The installer includes a link to the EZConnect docs: http://docs.oracle.com/cd/E11882_01/network.112/e10836/naming.htm. All the examples in that doc include a forward slash, but every combination I've tried results in an error like this: Invalid database TNS "sever.example.com/service_name". Use only ASCII letters (a-z, A-Z), numbers (0-9), underscores (_) and dots (.). I can't find any examples of EZConnect that don't include a forward slash. That error is from MediaWiki, not Oracle; I'm tailing the listener log and no connection is made, so MediaWiki is returning the error without trying to connect. I'm using PHP OCI8 with the Oracle Instant Client. I don't have a tnsnames.ora set up for this client, which is kind of the point of EZConnect. I did write a test PHP script that connects via oci_connect just fine. Has anyone configured MediaWiki to use Oracle with EZConnect? If so, what did you use in the installer?
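
    Given that the installer's validation rejects slashes outright, one workaround sketch (alias, host, port and service name are placeholders) is to fall back to a tnsnames.ora alias for this one client and enter the bare alias in the installer field, even though that gives up EZConnect:

        # $TNS_ADMIN/tnsnames.ora for the PHP/OCI8 Instant Client
        WIKIDB =
          (DESCRIPTION =
            (ADDRESS = (PROTOCOL = TCP)(HOST = server.example.com)(PORT = 1521))
            (CONNECT_DATA = (SERVICE_NAME = dbname))
          )
        # then use WIKIDB as the "database TNS" value in the MediaWiki installer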

    Read the article

  • Can someone explain the true landscape of Rails vs PHP deployment, particularly within the context of Reseller-based web hosting (e.g., Hostgator)?

    - by rcd
    Currently, I have a reseller account with the company HostGator. I design websites, which up until now have occasionally been wrapped in WordPress CMSs and the like (PHP applications). I then sell hosting (of the site I've designed) to the client, which is pretty simple, in that I can simply click a button and add a new shared hosting account/site with whatever settings I want. Furthermore, I then use WHMCS to automate billing and account management. It's a nice package and pretty simple. I pay something like $25 a month and can sell a hundred accounts under this (because my clients' bandwidth requirements are low).

    Now I am finding the need to develop more customized applications, including a minimalist CMS and several proprietary things, and I soon anticipate developing these apps for clients as well. Thus, I've spent the past few months learning Rails, and it's coming along well now. The thing that has nagged at me all along, though, is the deployment issue: I can't wrap my brain around it. It seems like all of the popular options (Heroku, etc.) have nice automation with git and are set up in the "Rails way". I get that (sort of). But it's terribly expensive: a single dyno, a helper, and the cheapest database (which they say is mainly suitable for testing) that isn't limited to 5MB runs $51. That's for ONE app. Throw in a "production" DB and you're over $200, which is the same price as getting a server somewhere, right?

    Meanwhile, going back to what I guess is a "traditional" hosting environment with HostGator, their server only has Ruby 1.8.7 and Rails 2.3.5. No Rails 3, and no Passenger (not that I really understand the difference between CGI and mod_rails or whatever, but they say Passenger is the simplest). So am I to understand that if I build an app in Rails 3, it won't run at all on this host? But I already have these accounts under my reseller account there, all running static HTML and/or PHP. So what now? How do I get all of this under one simple (and affordable) roof?

    Forgive my ignorance, but I just don't get it. Managing a VPS is cool and all, but it entails learning server administration and security, and it's expensive. I get that a shared and/or reseller "server-based" (forgive the terminology) setup may be inadequate for large-scale apps that use a lot of bandwidth, but what about those of us who are building real (but small and low-bandwidth) apps with Rails and who want to deploy them simply and cheaply, using the same conceptual approach as PHP? Even after learning all of this Ruby and Rails stuff for months, I'm questioning whether it's worth it when it comes to deployment. I want to build a small app, upload it to my home directory on a shared server account, and just make it run. Why should that be so hard? Am I just choosing the wrong language/framework? Forgive my ignorance in the subject; these questions are not rhetorical, I'm just trying to learn here.

    So: 1) I'd appreciate it if someone could give me a good rundown of how to understand deployment in Rails vs. PHP. 2) I'd appreciate it if someone could address my issue with running a hosting/web business around reseller hosting (HostGator) while also being able to host Rails apps. Can it be done? And how can a company like HostGator completely ignore what's current in Rails/Ruby? Thanks.

    Read the article

  • Apache2 and .htaccess help

    - by user1052448
    For some reason domain.com/YYYY-MM-DD redirects to domain.com/var/htdocs/public_html.

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^[^\./]+\.[^\./]+$
        RewriteRule ^/(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]
        RewriteBase /
        RewriteRule ^index\.php$ - [L]
        RewriteRule ^archive/index\.php$ - [L]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^. /archive/index.php [L]

    I'm trying to get anything after www.domain.com other than index.php and archive/index.php to display MySQL content on archive/index.php (by grabbing the PHP request URI). The browser URL should remain www.domain.com/YYYY/MM/DD/blog-title, or www.domain.com/YYYY/MM/ to display all posts from YYYY-MM.
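
    A minimal corrected sketch, assuming these rules live in a .htaccess file: in per-directory context Apache strips the leading slash before matching, so ^/(.*)$ never fires and the remaining rules can misbehave; dropping the slash (and keeping the file/directory checks) is usually enough.

        RewriteEngine On
        RewriteBase /

        # add www. when the host has no subdomain
        RewriteCond %{HTTP_HOST} ^[^.]+\.[^.]+$
        RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]

        # leave the two entry scripts alone
        RewriteRule ^index\.php$ - [L]
        RewriteRule ^archive/index\.php$ - [L]

        # everything else that is not a real file or directory goes to the
        # archive controller, which can read the pretty URL from REQUEST_URI
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /archive/index.php [L]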

    Read the article

  • Good podcasting solution?

    - by burnso
    I simply ask for the best, most common and simplest way to set up a podcast. I run a Joomla-based website for a small church and now need a simple, cheap and effective solution for an audio podcast. I am looking for a solution that will do the following:

        - Users upload audio files to a service (preferably not our own site) that is cheap, fast and simple to use (Dropbox, for example).
        - Files are easily embeddable into Joomla website articles, to be played on the spot through a simple-to-use media player (preferably via RSS feeds, to make weekly updates easy).
        - Files are downloadable.
        - Files are playable on iPhones and other smartphones.
        - The podcast can be published via iTunes.
        - The solution doesn't need a lot of extra, new third-party software; I'd rather keep it simple and familiar than adopt a completely new system with a steep learning curve.

    At the moment we use Vimeo to host the podcast as video files, but we'd like to move to something simpler that doesn't involve a series of difficult steps to upload the files to the web.
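
    For the feed side, a minimal podcast RSS sketch (titles, URLs and file sizes are placeholders): host the MP3s wherever is convenient, point the enclosure URLs at them, link or embed the feed from the Joomla articles, and submit the same feed URL to iTunes.

        <?xml version="1.0" encoding="UTF-8"?>
        <rss version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd">
          <channel>
            <title>Example Church Sermons</title>
            <link>http://example.org/sermons</link>
            <description>Weekly sermon audio</description>
            <language>en-us</language>
            <itunes:author>Example Church</itunes:author>
            <item>
              <title>Sermon for 3 June</title>
              <enclosure url="http://files.example.org/sermons/2012-06-03.mp3"
                         length="24576000" type="audio/mpeg"/>
              <guid>http://files.example.org/sermons/2012-06-03.mp3</guid>
              <pubDate>Sun, 03 Jun 2012 12:00:00 +0000</pubDate>
            </item>
          </channel>
        </rss>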

    Read the article

  • How to configure Google sitemap links in Wordpress? (without editing its HTML or PHP source code)

    - by Alexander Farber
    I run a WordPress 3.7.1 de_DE site, but I don't have much experience with it yet. When my site comes up in a Google search, two Google sitemap links (sitelinks) are displayed underneath it. One of them points to a non-existent page, /imprint, and I had to add a page at that URL to work around this (I want the URL to actually be /impressum anyway, since the site is in German and has German URLs). How can I configure these Google sitemap links in WordPress (without editing its HTML or PHP source code)?
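
    The sitelinks themselves can't be configured from the site; at most an unwanted one can be demoted in Google Webmaster Tools. What can be handled without touching theme or PHP files is the stale URL itself, for example with a one-line redirect in the WordPress .htaccess (a minimal sketch, placed above the standard WordPress rules):

        Redirect 301 /imprint /impressum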

    Read the article

  • How do I prevent ISPs from killing downloads of files in mid-transfer?

    - by Gorchestopher H
    I run a small website with a few users and low traffic, mostly to share personal MP3 files with a small community. Depending on their ISP, my users can't always download or stream larger files (by larger I mean larger than 1MB). Essentially the host either stops sending or the client stops receiving: one of the links along the connection chain simply ends its connection before the transfer completes. Traceroute shows no connection issues, and there are no problems with short transfers that take no more than a few seconds; it's the transfers of ten seconds or so that end prematurely. Just doing a straight download with a direct link can yield this error if you have the wrong ISP. Strangely enough, this is most common with users whose ISPs are essentially independent providers that buy service via a fiber link. Unfortunately these providers aren't very knowledgeable, are unable to do any testing, and insist it's a problem with the host. I have gotten my host to move my site to different servers of theirs, to the same effect, and nearly identical sites (affiliate sites, actually) experience no such issue. What can I do to further troubleshoot this? How can I prove that someone is dropping the ball, and identify who that party is? Can I do a 5MB traceroute?

    Edit: maybe I can clear up some misconceptions with my question:

        - The files are not very large; they are simply over 2MB.
        - The users do not have "slow" connections; they are at least 5 Mbps.
        - This "time out" happens very quickly, in the realm of 5 seconds, so I don't know if it's a timeout or not. The user often gets 1 or 2MB in that time.
        - I have tried streaming with a Flash player, saving the target, forcing the download, and letting the browser stream the file.
        - I have tried different browsers (Firefox, IE, Chrome).
        - Users are able to download identical files when they are hosted elsewhere.
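
    A minimal client-side test sketch that the affected users (or their ISP) could run, with a placeholder URL, to show where and when the transfer dies:

        # 1. does a plain download stall at roughly the same byte count each time?
        curl -v -o test.mp3 http://example.com/files/test.mp3
        # 2. fetch the same file in 1MB ranged chunks; if the chunks succeed
        #    where one long request fails, something on the path is cutting
        #    long-lived connections rather than dropping packets
        curl -r 0-1048575       -o part1 http://example.com/files/test.mp3
        curl -r 1048576-2097151 -o part2 http://example.com/files/test.mp3
        # 3. watch the route over many cycles instead of a one-shot traceroute
        mtr --report --report-cycles 100 example.com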

    Read the article

  • How can I set up private, per-user sections on Joomla?

    - by Michael Paulukonis
    For this weekend's GiveCamp project, my team has been tasked with adding some functionality to an existing Joomla-powered website for a non-profit. A certain type of user will log in and have access to a personal area where they can upload files, check for messages, see tasks that have been assigned, etc. Each user would have their own area. They would not be creating pages, and their information would only be visible to themselves (or to a site administrator, of course; no sort of weird HIPAA privacy involved). None of us has worked with Joomla before, but we'd like to help this non-profit. We're not sure if we're searching using the wrong terms or if we're just not finding it. Is such a solution possible in Joomla? And/or are we better off building a standalone solution that interfaces with the same MySQL database as Joomla?

    Read the article

  • Website that allows Twitter-like messages accessed via tiny URLs

    - by blunders
    I'm looking for a website that lets me post Twitter-like messages that are accessed via tiny URLs, and that lets me create an account to get reports on the access of each of these messages. Basically, unlike Twitter, each message would require its tiny URL to be accessed and there would be no central public index of an author's messages, but the author would be able to log in and see a centralized listing of all their messages, along with reports on whether the messages had been accessed.

    Read the article

  • Can I forward "referrer" information to another address?

    - by user5679
    I have two addresses for two servers: www.urlA.com and www.urlB.com. I have all my websites installed on www.urlB.com, but visitors primarily recognize www.urlA.com. I have www.urlA.com/index.php set up as the following:

        <?php header('Location: http://www.urlB.com/'); ?>

    But when I use this forwarding method, the tracking JavaScript on www.urlB.com cannot recognize where the visitors came from; I only get "NO REFERRING LINK". What should I do to accomplish both of the following: 1. forward urlA.com to urlB.com, and 2. receive the referrer information?
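
    A minimal PHP sketch of one workaround (the origref parameter name is made up for illustration): a Location redirect frequently loses the original referrer, so pass it along explicitly in the query string and have the tracking page on urlB.com read it back out.

        <?php
        // index.php on www.urlA.com: forward and carry the original referrer along.
        $ref = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
        header('Location: http://www.urlB.com/?origref=' . urlencode($ref), true, 301);
        exit;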

    Read the article

  • Good analytics tools that can track visitor actions from a particular source?

    - by tnorthcutt
    Are there good tools that can track what actions a certain subset of visitors (i.e., those from a particular source) take once they're on your site? As far as I know (and I could be wrong), Google Analytics can't do this beyond telling you how long they stayed, the bounce rate, and the average number of pages. I'm looking for something that can tell me which links they clicked on and, if possible, break it down per visitor. Free solutions would be great, but I anticipate that this will require a paid solution.

    Read the article
