  • Can someone explain the true landscape of Rails vs PHP deployment, particularly within the context of Reseller-based web hosting (e.g., Hostgator)?

    - by rcd
    Currently, I have a reseller account with the company HostGator. I design websites, which up until now have occasionally been wrapped in WordPress CMSs and the like (PHP applications). I then sell hosting (of the site I've designed) to the client, which is pretty simple, in that I can simply click a button and add a new shared hosting account/site with whatever settings I want. Furthermore, I then utilize WHMCS to automate billing and account management. It's a nice package and pretty simple. I pay something like $25 a month, and can sell a hundred accounts under this (because my clients' bandwidth requirements are low).

    Now I am finding the need to develop more customized applications, including a minimalist CMS and several proprietary things. I soon anticipate developing these apps for clients as well. Thus, I've spent the past few months learning Rails, and it's coming along well now. The thing that has nagged at me all along, though, is the deployment issue. I can't wrap my brain around it. It seems like all of the popular options (Heroku, etc.) have nice automation with git and are set up in the "Rails Way". I get that (sort of). But it's terribly expensive... a single dyno, a helper, and the cheapest database that isn't limited to 5MB (which they say is mainly suitable for testing) runs $51. This is for ONE app!!! Throw in a "production" DB and you're over $200. This is like... the same price as getting a server somewhere, right?

    Meanwhile, going back to what I guess is a "traditional" hosting environment with HostGator, their server only has Ruby 1.8.7 and Rails 2.3.5... no Rails 3. And no Passenger (not that I really understand the difference between CGI and mod_rails or whatever, but they say Passenger is the simplest). So am I to understand that if I build an app in Rails 3, it won't run at all on this host? But damn, I already have these accounts under my reseller account there, all running static HTML and/or PHP stuff, right? So what now? How do I get all of this under one simple (and affordable) roof?

    Forgive my ignorance, but I just don't get it. Managing a VPS is cool and all, but it entails learning server admin stuff and security... and it's expensive. I get that a shared and/or reseller "server-based" (forgive the terminology) setup may be inadequate for large-scale apps that use a lot of bandwidth... but what about for those of us who are building real (but small and low-bandwidth) apps with Rails and who want to deploy them simply, cheaply, using the same conceptual approach as PHP? Even after learning all of this Ruby and Rails stuff for months, I'm questioning whether it's worth it when it comes to deployment. I want to build a small app, upload it to my home directory on a shared server account, and just make it run. Why should that be so hard? Am I just choosing the wrong language/framework? Forgive my ignorance in the subject; these questions are not rhetorical; I'm just trying to learn here.

    So: 1) I'd appreciate it if someone could give me a good rundown of how to understand deployment in Rails vs. PHP. 2) I'd appreciate it if someone could address my issue with running a hosting/web business around reseller hosting (HostGator) while also being able to host Rails apps. Can it be done? And how can a company like HostGator completely ignore what's current in Rails/Ruby? Thanks.

  • Web pages with mixed ownership photos

    - by dstonek
    I have a photo website. 15% of the photos belong to approved registered users. They agree to my terms about uploading their images to my web pages, and I include a photographer credit in the bottom-right corner. Regarding identifying the site with Google, every page contains a Google+ button pointing to MY Google+ page; it also contains <link href="https://plus.google.com/nnnnnnnnnn/" rel="publisher" />. I need some advice on how to respect Google's rules so that my pages containing other photographers' images are not penalized as duplicated or stolen content. My concern is also whether adding G+ links (to MY photo page) and the Google publisher id would harm my site's rank because of the pages containing third-party photos.

  • Why was my web site visited by ARPA?

    - by ilhan
    Twenty minutes ago a visitor with the IP address 66.116.153.122 visited my web site. Its domain is rev.opentransfer.com.122.153.116.66.in-addr.arpa. User-agent: Mozilla/5.0 (Windows; U; Windows NT 6.1; de; rv:1.9.2.11) Gecko/20101012 AskTbSPC2/3.8.0.12304 Firefox/3.6.11. Language: en-us,en;q=0.5,de-de,de;q=0.8. Compression: gzip,deflate. Oh, and my domain name ends with .name. Why has ARPA visited me?
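
    For reference, the .in-addr.arpa name appears to be just the reverse-DNS (PTR) form of that IP address (the octets written in reverse order). A minimal PHP sketch of such a lookup, with the IP above hard-coded purely for illustration:

        <?php
        // Look up the reverse-DNS (PTR) name behind a visitor's IP address.
        $ip   = '66.116.153.122';
        $host = gethostbyaddr($ip); // hostname, or the IP itself if no PTR record exists
        echo "Visitor IP:  $ip\n";
        echo "Reverse DNS: $host\n";
        ?>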

  • Duplicating content from another site and adding value (summaries, statistics) - ranking and courtesy

    - by Krastanov
    I am working on a site that takes a governmental database, provides a number of statistical and other summaries, and also posts the original data. However, this data (mostly long pieces of text) is also published on the official governmental site (without the added value of summaries). Should I worry about Google ranking due to this duplication? What is the preferred way to point to the official source of the information? There is no advertisement on my site. My site is ".com". The governmental site is ".bg".

  • Summary of usage policies for website integration of various social media networks?

    - by Dallas
    To cut to the chase... I look at Twitter's usage policy and see limitations on what can and can't be done with their logo. I also see examples of websites that use icons that have been integrated with the look and feel of their own site. Given Twitter's policy, for example, it would appear that legal conversations/agreements would need to take place to do this, especially on a commercial site. I believe it is perfectly acceptable to have a plain text button that simply has the word "Tweet" on it, with the same functionality. My question is whether anyone can provide online (or other) references that attempt to summarize what can and can't be done when integrating various social networks into your own work. The answer I will mark as correct will be the one which provides the best resource(s) giving the best summaries of what can and can't be done with specific logos/icons, with a secondary factor being that a variety of social networking sites are addressed in your answer. Before people point to specific questions, I am looking for a well-rounded approach that considers a breadth of networks and considerations.

    Background: I would like to incorporate social media icons and functionality, but would like to consider what type of modifications can be done without needing to involve lawyers. For example, can I bring in a standard Facebook logo, but incorporate my site's color into the logo? Would the answer differ if I maintained their color, but added a few pixels of another color to transition? I am not saying I want to do this, but rather using it as an example.

  • Website (X)HTML Code Change Detection [closed]

    - by 0pt1m1z3
    I am looking for an enterprise-grade service or a tool that can be used to scan / fingerprint websites and notify when major (X)HTML code changes are detected. The tool should be able to continuously scan thousands of websites and determine the percentage of HTML code that has been modified since the last run, and then either save the data where it can be easily accessed or send periodic notifications. I know of services like ChangeDetect.com, but they don't do markup-only changes and instead focus on everything, including content. We don't really care about the content presented, because a lot of the sites we need to cover are updated frequently with new content.
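
    To make the requirement concrete, here is a rough PHP sketch of the markup-only comparison I have in mind (the URL and snapshot path are placeholders; it throws away text nodes and compares only the remaining tag structure between runs):

        <?php
        // Rough sketch: measure how much a page's markup changed since the last run.
        $url  = 'http://www.example.com/';        // placeholder target site
        $file = '/tmp/example-com-markup.html';   // placeholder snapshot path

        $html = file_get_contents($url);
        // Strip the text between tags so only the tag structure is compared.
        $markup = preg_replace('/>[^<]*</', '><', $html);

        $previous = is_file($file) ? file_get_contents($file) : '';
        similar_text($previous, $markup, $similarity);
        printf("Markup changed by roughly %.1f%% since the last run\n", 100 - $similarity);

        file_put_contents($file, $markup);        // keep the snapshot for the next run
        ?>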

  • Does the e-commerce platform matter for branding?

    - by c s h
    The place I work is now looking into developing a new e-commerce site on the Magento platform. Magento will fill all of our needs. I was just wondering if it is in any way unprofessional to do it this way (impression is something we are really worried about): will people who visit the site look at our business differently knowing we used Magento or any other e-commerce platform? There are ways to find out which platform a site runs on. I use Chrome Sniffer to find out what platforms are used to develop each site; there are other tools available for different browsers. Bottom line: will an e-commerce platform affect the trust in my brand?

  • Are people getting away with the "follow 1000s and then unfollow" Twitter trick? [closed]

    - by Baumr
    It seems that more and more people are trying to 'cheat' their way into more Twitter followers. The basic mechanism is:

    - Follow thousands of people on Twitter with the hope that they'll follow you back.
    - Once it reaches a point you're happy with, start gradually unfollowing them.
    - That way, at the end of the day, it'll look like a lot of people follow you unconditionally.

    I've seen self-proclaimed social media and SEO experts do this. It's clear they want to look influential, and will use black hat social media tactics to do so. I can see how it can work, so is Twitter letting them get away with it? Should it?

  • Safe way to send thousands of promotional emails

    - by Arsheep
    My new partner has an email list with over 1 million email addresses for targeted traffic (no spam... only genuine subscribers from his last startup). But now I have a problem: how can I send an email to all those addresses? I can't use my ISP's SMTP mailer; they will block me immediately for bulk mailing. I thought of a way to send the emails slowly, like dividing the list into sets of a few thousand and sending to one set daily. Would that be a fine solution?
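
    To illustrate the idea, a minimal PHP sketch of the "few thousand per day" approach (file names and batch size are placeholders; mail() stands in for whatever mailer actually gets used):

        <?php
        // Sketch: send one fixed-size slice of the list per run (e.g. via a daily cron job).
        $batchSize = 2000;                                   // placeholder batch size
        $all       = file('list.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
        $offset    = (int) @file_get_contents('offset.txt'); // where the last run stopped

        foreach (array_slice($all, $offset, $batchSize) as $address) {
            mail($address, 'Our newsletter', 'Hello from the new project!');
            usleep(200000);                                  // throttle to ~5 mails per second
        }

        file_put_contents('offset.txt', min($offset + $batchSize, count($all)));
        ?>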

  • How to change URLs to user-friendly URLs

    - by German
    I'm refactoring my ASP.NET application from ASP.NET 3.5 to 4.0. I'm also changing the URLs to user-friendly URLs, for example /product.aspx?id=100 to /product-name/100. All my pages are indexed by search engines and the site has already been online for 6 years. I'm planning to do 301 redirects from the old pages to the new ones. I want to make sure I won't lose the rank and traffic. Any suggestions on how to do it properly?

  • Photos - do I really need to look for the author and ask his permission when posting them on my site?

    - by user6456
    When I find a photo somewhere on the internet, without any explicit information on whether I can re-publish it on my own website, and without any hint of who the owner/author of that photo is, can I still do it? I'm puzzled here because I've seen millions of websites, often very big ones, that repost photos, most probably found via Google, and it's VERY unlikely they bothered to look for and contact the authors of those photos. Is every one of those sites likely to be sued at any moment? And what about the case of forums and content provided by users - there is virtually no way of prevention there.

  • How does delicious.com avoid being sued for copyright infringement?

    - by Stanish
    With the recent redesign of delicious.com, they've added a much more graphical home page. The site continues to be a service for people to bookmark and share websites they come across on the web. The Delicious home page is now made up of images taken from those linked sites. See for yourself at http://delicious.com. I would like to know what in the law allows them to do this, considering the images represent the main content of the page and they clearly do not own the copyright to those images. I know there is some leeway given to search engines, where it is considered fair use to use a small portion of the content if the aim is to lead people to the originating site. Does that apply here?

  • Can I forward "referrer" information to another address?

    - by user5679
    I have two addresses for two servers:

    www.urlA.com
    www.urlB.com

    I have all my websites installed on www.urlB.com, but visitors primarily recognize www.urlA.com. My www.urlA.com/index.php is the following:

        <?php header('Location: http://www.urlB.com/'); ?>

    But when I use this forwarding method, the tracking JavaScript on www.urlB.com cannot recognize where the visitors came from; I only obtain "NO REFERRING LINK". What should I do to accomplish the following two jobs:

    1. forward urlA.com to urlB.com
    2. receive the referrer information
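
    One idea I'm considering (just a sketch, not a verified fix) is to pass whatever referrer www.urlA.com itself received along in the redirect, so the tracking code on www.urlB.com can read it from the query string instead of relying on the Referer header:

        <?php
        // www.urlA.com/index.php - pass the referrer urlA received along to urlB
        // in a "ref" query parameter (the parameter name is arbitrary).
        $ref = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
        header('Location: http://www.urlB.com/?ref=' . urlencode($ref));
        exit;
        ?>

    The tracking JavaScript on www.urlB.com would then have to be taught to look at the ref parameter as well, which may or may not be practical.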

  • Is there a way to learn why Google penalized a site?

    - by pawelbrodzinski
    Is there any way to learn for sure why Google penalized a specific site? I'm thinking about the situation where the webmaster/site administrator is aware of Google's rules and is sure they aren't breaking any, but the site is penalized nevertheless. The only information you get from Google is that they processed your reconsideration request, but they say neither what the result is nor what the reason for the penalty is if they keep the site penalized. You can try to get information on the Google webmasters forum or here, but most of the time these are only speculations. Considering the site administrator tried to find out what's wrong but failed, is there a source which can tell what the problem is?

  • Looking for a package allowing user-entered profiles

    - by Mark
    The title was a little hard to word, but take this as an example: a user goes to the site, creates an account, and then has his/her own profile to edit. Let's say the profile includes height, weight, sex, eye color, etc. I've really only used WordPress before, but I'm sure something else would cater to this better. The entire site is focused around a person having their own profile page with the info they supply. Thanks!

  • Amazon EC2 vs Dedicated server at Hetzner, what's the use for EC2?

    - by C-Blu
    After searching the web I still can't find a reason to use EC2. What's the point of scaling on EC2? If you expect a huge burst in traffic, they say. OK, but what if you already have a couple of sites with good traffic and, for example, a medium reserved EC2 instance is not enough? You are paying $36.60 (medium reserved for 1 year) in the EU (Ireland) + traffic + optional expenses for databases and S3 if you use them. Of course, at some point, while you are under $56.6-$66.1, you can optimize your hosting costs with Amazon EC2. But at some point, if you purchase an EX4 server from Hetzner, it will surpass your performance needs for a long time before you get massive traffic. (Am I wrong?)

    CPU: i7-2600 Quadcore (3.4-3.8 GHz)
    RAM: 16 GB
    HDD: 2x3 TB SATA (6 Gbit/s) - I think the disk performance of a dedicated server is better than that of Amazon EBS
    Traffic: 10 TiB per month included

    This is what you get from Hetzner for $56 (- 19% VAT) or $66 for EU residents. Please tell me, what's the reason to use Amazon? Which load won't a server from Hetzner take that Amazon Auto Scaling will? Is the maintenance of a dedicated server vs EC2 still the same? Or won't a hardware failure at Amazon ruin your EBS storage? I'm still not at the level where I need expensive hosting, but I want to know beforehand, just to be sure whether the Amazon infrastructure is better than the pure performance of Hetzner's hardware.

  • How to configure Google sitemap links in Wordpress? (without editing its HTML or PHP source code)

    - by Alexander Farber
    I run a WordPress 3.7.1-de_DE site, but don't have much experience with it yet. When my site comes up in a Google search, there are 2 Google sitemap links displayed underneath. One of them points to a non-existent webpage /imprint though, and I had to add a page at that URL to work around this (and I want the URL to actually be /impressum anyway, since the site is in German and has German URLs). How can I configure these Google sitemap links in WordPress (without editing its HTML or PHP source code)?

  • Do I need multiple accounts in Facebook for each of my product sites?

    - by John
    I have a dozen sites, which include for-profit ones as well as charity ones. For each site I've created a Facebook company/charity account. After creating those accounts it dawned on me that I could just as well have created a new page for each of my sites from my personal account alone, even if a site has multiple product pages. What would be the right strategy? Also, as per Facebook's terms we can have only a single personal account. I do have only a single personal account, but for each site I've created only company pages. I hope I'm not violating the Facebook terms.

  • How to smartly optimize ads on a website

    - by YardenST
    I have a content website that presents ads. Now my team wants to optimize it for a better experience for the users (we really believe our ads are good for our users). We are sure that every website deals with this issue, and there must be some known ways and methods to deal with it that smart people have thought of before. So what I'm looking for is a tested, working method to optimize ads. For example:

    - if I was asking about optimizing my website for Google, I would expect you to answer: learn SEO
    - if I was asking about optimizing the use of my website: usability testing
    - navigation: information architecture

    What is the field that deals with optimizing ads?

  • What's a good host for an active vBulletin site?

    - by Kyle
    I've been switching hosts, using a VPS each time, and I'm just really not sure I'm finding the right VPSes. I've used a VPS from burst.net & rubyringtech and I just feel like it's slowly killing my site because of the slow speed. I really don't know if it's the network or the VPS itself, but I really wish to fix this. When I top into the VPS at peak times it shows this:

        top - 03:18:56 up 16:33, 1 user, load average: 1.33, 1.40, 1.33
        Tasks: 30 total, 1 running, 29 sleeping, 0 stopped, 0 zombie
        Cpu(s): 27.2%us, 13.6%sy, 0.0%ni, 59.2%id, 0.0%wa, 0.0%hi, 0.0%si, 0.0%st
        Mem: 1048576k total, 679712k used, 368864k free, 0k buffers
        Swap: 0k total, 0k used, 0k free, 0k cached

    And pages take at least a good 2-3 minutes to load. I have only like 50-60 members on the forum. I had a shared hosting account before and the forum was lightning fast... Is a VPS a bad idea? :\ What should I do to fix this? I'm running lighttpd with xcache, and the latest MySQL + PHP versions. The server is an Intel i7 2600 w/ 1gb uplink (I think the 1gb uplink is a lie because I've tested the network and the highest download speed I've seen was 20mb/s from a code.google page). All in all, I've seen people talking about Linode. Should I try them? I honestly don't need a dedicated server yet; it's only 50-70 members online. What should I do? I really want a VPS because I enjoy root access. Does anyone have any suggestions?

  • Tricky mod_rewrite challenge

    - by And Finally
    I list about 9,000 records on my little site. At the moment I'm showing them with a dynamic page, like http://domain.com/records.php?id=019031 But I'd like to start using meaningful URLs like this one on Amazon http://www.amazon.co.uk/Library-Mythology-Oxford-Worlds-Classics/dp/0199536325 where the title string on the root level gets ignored and requests are redirected to the records.php page, which accepts the ID as usual. Does anybody know how I could achieve that with mod_rewrite? I'm wondering how I'd deal with requests to my other root-level pages, like http://domain.com/contact.php, that I don't want to redirect to the records page.
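
    For illustration, here is the rough shape of rule I imagine, as an Apache .htaccess sketch (assuming mod_rewrite is available, that the friendly URLs look like /some-title/019031, and that IDs are purely numeric; existing files such as contact.php fall through untouched):

        RewriteEngine On
        # Leave requests for real files and directories (contact.php, images, ...) alone
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        # /any-title-here/019031  ->  /records.php?id=019031 (the title part is ignored)
        RewriteRule ^[^/]+/([0-9]+)/?$ records.php?id=$1 [L,QSA]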

  • SEO problem for site with 2 domains [closed]

    - by Harry
    Possible Duplicate: What is duplicate content and how can I avoid being penalized for it on my site? I have two domains pointing to the same site. I want both domains to co-exist; they share most of the same content, but they differ in design and they are aimed at different markets / rival communities. Is there a way to let Google know that these two domains are the same site, so I don't get hit with a duplicate content penalty? Any other general SEO tips for this situation would also be welcome. Thanks. Come on man, why was this closed? The linked page is completely irrelevant for me.

  • How to pass information across domains to ask for newsletter only once?

    - by Michal Stefanow
    Let's assume the following scenario. I have two sites:

    example1.com
    example2.com

    When a user visits 1, there is a prompt: "please sign up to the newsletter". The same thing happens when the user visits 2. However, when navigating from 1 to 2, I don't want the signup form to be shown. My first thought was 3rd-party cookies, but it seems that they are blocked / not working:

    http://stackoverflow.com/questions/4701922/how-does-facebook-set-cross-domain-cookies-for-iframes-on-canvas-pages?rq=1
    http://stackoverflow.com/questions/172223/how-do-i-set-cookies-from-outside-domains-inside-iframes-in-safari?rq=1

    Another thought is to append #noshow to each URL, but that would require some work - for instance, a script that would intercept click / tap events and modify the URL structure depending on the address (but that seems hacky). I wonder if you know a robust, well-established solution to this issue? Thanks
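
    To sketch that second idea (using a query parameter such as ?nosignup=1 instead of a #fragment, since the fragment part of a URL is never sent to the server), each site could set its own first-party cookie when the visitor arrives from the sister site; the parameter and cookie names here are placeholders:

        <?php
        // example2.com - suppress the signup prompt when the visitor arrives
        // from the sister site (links there would append ?nosignup=1).
        $suppress = isset($_GET['nosignup']) || !empty($_COOKIE['seen_signup']);
        if (isset($_GET['nosignup'])) {
            setcookie('seen_signup', '1', time() + 60 * 60 * 24 * 30, '/');
        }
        if (!$suppress) {
            echo '<div class="signup">Please sign up to the newsletter</div>';
        }
        ?>

    The mirror-image logic would sit on example1.com, and it still only covers visitors who actually click through from one site to the other, which is part of why it feels hacky.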

  • Is SEO affected negatively by having densely encoded identifiers of content in URLs?

    - by casperOne
    This isn't about where to put the id of a piece of unique content in URLs, but more about densely packing the URL (or, does it just not matter). Take for example a hypothetical post in a blog:

    http://tempuri.org/123456789/seo-friendly-title

    The ID that uniquely identifies this is 123456789. This corresponds to a look-up and is the direct key in the underlying data store. However, I could encode that in, say, hexadecimal, like so:

    http://tempuri.org/75bcd15/seo-friendly-title

    And that would be shorter. One could take it even further and have more compact encodings; since URLs are case sensitive, one could imagine an encoding that uses numbers, lowercase and uppercase letters, for a base of 62 (26 upper case + 26 lower case + 10 digits):

    0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz

    For a resulting URL of:

    http://tempuri.org/8M0kX/seo-friendly-title

    The question is, does densely packing the ID of the content (the requirement is that an ID is mandatory for look-ups) have a negative impact on SEO (and dare I ask, might it have any positive impact), or is it just not worth the time? Note that this is not for a URL shortening service, so saving space in the URL for browser limitation purposes is not an issue.
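
    For concreteness, a small PHP sketch of the base-62 conversion using exactly that alphabet (function names are arbitrary):

        <?php
        // Encode/decode a numeric ID with the 62-character alphabet above
        // (digits, then upper-case, then lower-case letters).
        const BASE62 = '0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz';

        function base62_encode($n) {
            $out = '';
            do {
                $out = substr(BASE62, $n % 62, 1) . $out;
                $n   = intdiv($n, 62);
            } while ($n > 0);
            return $out;
        }

        function base62_decode($s) {
            $n = 0;
            foreach (str_split($s) as $c) {        // assumes only alphabet characters
                $n = $n * 62 + strpos(BASE62, $c);
            }
            return $n;
        }

        echo base62_encode(123456789), "\n";       // 8M0kX
        echo base62_decode('8M0kX'), "\n";         // 123456789
        ?>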

  • In Google Analytics, how can I determine the value of a page if no goals or revenue have been set up?

    - by Brandon Durham
    I have 4 years of data in Analytics with over 20 million pageviews for the entire site. No goals have ever been set up, and while the site is an ecommerce site, no ecommerce features in Google Analytics have ever been taken advantage of. So I have no way to determine what the actual value of a page is. I've been tasked with determining if a particular page on the site is worth keeping around. How might I use all standard data (pageviews, bounce rate, time on page, time on site, etc.) to help determine the value of this page? I really appreciate any help I can get!
