Search Results

Search found 18210 results on 729 pages for 'website promotion'.

Page 139/729 | < Previous Page | 135 136 137 138 139 140 141 142 143 144 145 146  | Next Page >

  • How can I remove the security/malicious user warning from my website?

    - by BigBoy1337
    I have a domain name, tradespring.net (and www.tradespring.net), that redirects to my Heroku app via a CNAME record. However, when I first try to access these sites, the browser gives me a malicious-site warning - "This is probably not the site you are looking for!" and so on - with "proceed anyway" or "back to safety" options. It's because my browser realizes that it is redirecting. How can I make sure anyone's browser (not just mine) trusts this site and my Heroku app? I don't think I need an SSL certificate, because this site doesn't send sensitive info (credit card numbers, etc.).

    Read the article

  • How to edit source files and commit the changes to the new website?

    - by ajsie
    I've got Ubuntu installed with LAMP. I'm using WebDAV to upload/download files to/from the Ubuntu web server after I have edited the PHP source files in NetBeans. However, I wonder what the best practice is for editing source files and committing those changes to the new website. Since we are 2-3 developers, I guess we have to use SVN, but I have never used it before, so I wonder how it works. Should I install it and then select /var/www (Apache's webroot) as the repository folder? Then, when I check in, will all the changes apply immediately? Could someone please explain the following steps: how to check out and edit the source files, upload the files, and see the new changes on the website. I have only worked with a local Apache before, and it was only me. Now there will be more programmers, so I have to set up a decent, central environment for this, and I need to know how NetBeans, SVN, WebDAV and Apache work together. Thanks!
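
    A minimal sketch of the usual SVN round trip, assuming the repository lives outside the webroot and the live site in /var/www is itself a checked-out working copy (the repository URL and commit message below are hypothetical placeholders):

        # one-time setup per developer: check out a working copy
        svn checkout http://svn.example.com/repos/mysite/trunk mysite
        cd mysite

        # ... edit the PHP files in NetBeans or any editor ...

        # review and commit the local changes to the central repository
        svn status
        svn commit -m "Describe the change"

        # on the web server, pull the committed changes into the deployed working copy
        svn update /var/www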

    Read the article

  • Template for terms and conditions for a social-media-based website?

    - by Rubytastic
    I'm looking for a template for a terms-of-use text for social media websites. I'm actually a coder and not into legal language in general. Of course you could spend a thousand or two on a lawyer, but a three-or-four-page text shouldn't be too hard to compile yourself with some help. I'm not sure if this is the right spot to ask this question, but I love Stack Overflow and none of the Stack Exchange sites I could find matched better than this one. My first idea: look at some social media websites, grab some of their text, and rewrite it for my own specific usage. Are there templates for writing such a document? The same goes for a privacy policy, actually.

    Read the article

  • Cannot access personal website from home IP. More details inside.

    - by GX67
    This is a recent problem I've been having. My site can be accessed from almost everywhere except my home IP, which is where I do most of my editing/updating. I've tested from my school's network, a friend's connection out of state (multiple states), and a tethered connection through my friend's Android, and in all those cases viewing the site, accessing the cPanel, and using FTP all work. Here's what happens when I try it from my home IP: the page times out in Firefox, IE, and Chrome. Using the cmd, I ran tracert and ping, and both failed. Log here. downforeveryoneorjustme.com says my site is up, and so do the other site checkers. I can't access my cPanel or FTP accounts, and I can't access the host's site either (I use perfectz.info for hosting, and I can't reach their site). System settings: no firewall enabled, and the ports are seemingly properly forwarded (they are open in the router settings and open from everywhere else). I have an email forwarder set up from the cPanel that works just fine (i.e. I can receive emails sent to that address). If any other information is needed, I'll do my best to provide it.
    UPDATE @ilhan: I use two things: 1) the site cPanel in-browser, and 2) Dreamweaver CS5 FTP. @Matthias: I tested both, and it passes the dual-stack test with a 10/10. What should I do then?

    Read the article

  • What dangers await if I block non-standard, non-major-usa search engine bots from my USA only website?

    - by Ryan
    I noticed tons of bandwidth being used by non-USA search engine bots, so I began blocking them in an effort to save bandwidth and CPU cycles for actual users and the search engines they come from (Google, Bing, Yahoo, Ask, etc.). Other than potentially losing some international traffic (which isn't really important to us, since all of our content is very USA-centric), what additional dangers should I be concerned about? I'm using a modified version of Jeff Starr's User Agent Blocklist.
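
    For reference, user-agent blocking of this kind typically lives in .htaccess; a minimal sketch assuming Apache with mod_rewrite enabled (the bot names are illustrative placeholders, not the actual blocklist):

        <IfModule mod_rewrite.c>
          RewriteEngine On
          # Send a 403 to any request whose User-Agent matches one of the listed bots
          RewriteCond %{HTTP_USER_AGENT} (ExampleBotOne|ExampleBotTwo|ExampleSpider) [NC]
          RewriteRule .* - [F,L]
        </IfModule>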

    Read the article

  • What is this chargeback scam from eBooks bought on my website?

    - by Dan Friedman
    We have a scammer who is buying our eBooks and then performing chargebacks. Our eBooks don't have DRM, so if they wanted to resell them, they would only need to buy each book once. But instead, they keep buying the same books over and over again and then performing hundreds of chargebacks. We have created some additional rules in our fraud protection tools to block certain aspects, even though all the info looks legitimate, and we're hopeful this will slow them down. But my question is: what is the scam? If they aren't getting any product, and they only get chargebacks for something they already purchased, then they can't get additional money from the credit card company - so what's their motivation?

    Read the article

  • How many developers do I need to build a website like Freelancer.com in about 3-5 months? [closed]

    - by Sam
    I have been asked to make a list of the people I need to build something similar to freelancer.com. It's not exactly the same - it has a few more features too - but I can't really get my head around the whole freelancer.com site. I have built a social networking site from scratch, roughly 70% of Facebook plus 20% of Google+, in about 5 months with raw PHP, JS, CSS and Ajax. I don't think it will take me more than a month or so to build the whole freelancer.com from scratch. Please point out anything I should pay attention to. I am thinking about:
    2 PHP developers
    1 MySQL engineer
    1 network/server engineer
    1 graphics artist
    1 UI developer
    Time frame: 20 days
    Is this a good estimation?

    Read the article

  • Should I index my mobile duplicate of my desktop website on Google?

    - by Roy
    I have a duplicate of http://he.thenamestork.com at the URL http://he.thenamestork.com/mobile - all files are duplicated, while the mobile version has slightly different content. Note that when I write 'mobile' I only mean regular HTML4 with smartphone-friendly CSS. I have a series of redirects (using .htaccess) that lets smartphone users land directly on the mobile versions. But I wonder, should I index the mobile version as well, so those users can get direct, faster links? And what is the proper way of doing that without causing problems in Google search? I guess I'm asking whether there's a way to get Google to display regular URLs for desktop users and ../mobile/.. URLs for smartphone users, and whether that is smart SEO-wise.
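
    For reference, the usual way to describe a separate mobile URL to search engines is a pair of annotations, one on each version of a page; a minimal sketch using the domains from the question, where /page is a placeholder path:

        <!-- On the desktop page (http://he.thenamestork.com/page): -->
        <link rel="alternate" media="only screen and (max-width: 640px)"
              href="http://he.thenamestork.com/mobile/page">

        <!-- On the mobile page (http://he.thenamestork.com/mobile/page): -->
        <link rel="canonical" href="http://he.thenamestork.com/page">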

    Read the article

  • What is the best way to pass data into an API from a website?

    - by Chris Wakoksi
    My main objective is to create a user interface to provision (change the status of: activate, deactivate, or suspend) modems. I have received a SOAP developer's guide for this specific type of modem that covers the specific values I need to pass into Iridium's WSDL, but I cannot figure out how to do this from a web page I have created using HTML/CSS and PHP. I have been told by others that I need to know the portal, and I have no idea what that means. I am a beginner programmer who has learned programming for this project. Any suggestions as to how to pass my data into the provisioning API?
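
    In PHP the usual route for a WSDL-described service is the built-in SoapClient; a minimal sketch in which the WSDL URL, operation name, and parameter names are placeholders (the real ones come from the developer's guide):

        <?php
        // Hypothetical WSDL URL - substitute the one from the guide
        $client = new SoapClient('https://example.iridium-provider.com/provisioning?wsdl', array(
            'trace'      => 1,      // keep request/response around for debugging
            'exceptions' => true,
        ));

        try {
            // Hypothetical operation and parameter names, filled from the submitted form
            $result = $client->__soapCall('UpdateSubscriberStatus', array(array(
                'imei'   => $_POST['imei'],
                'status' => $_POST['status'],   // e.g. 'activate', 'deactivate', 'suspend'
            )));
            var_dump($result);
        } catch (SoapFault $e) {
            echo 'Provisioning call failed: ' . $e->getMessage();
        }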

    Read the article

  • Why am I getting domainpark.cgi being called from my website?

    - by Sean
    I used to test my site on www.exampleone.com, and now I have moved to the real domain www.realdomain.com; www.exampleone.com is now parked by 1and1 (the default). Now, when I check which requests are made by www.realdomain.com, I see domainpark.cgi and park.js from Sedo Parking being requested, as well as the JS that serves the ads by adclicks. How do I get rid of this? It's not on the index page at all, and it's causing a lot of strain and slowing my site down.

    Read the article

  • How can I implement an escrow payment system in my website?

    - by BeachRunnerJoe
    Hello. I'd like to build a web service similar to Kickstarter that allows users to pledge money to an idea, though I'm unsure how I can implement this kind of payment system. If the idea receives a specified amount of money, the donors are charged; if it doesn't, the donors are not charged. I've done some preliminary research and found Amazon Payments to be a possible solution provider for this, but I'm still unsure where to start and was hoping someone could point me in the right direction for implementing this kind of payment structure on my website. I should also note that this is primarily a prototype I'm building, so it's OK if the solution is limited to U.S. customers only. Also, I plan to build the site using Ruby on Rails. Thanks so much for your wisdom!
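
    Whatever the provider, the flow described above is usually implemented as authorize-now, capture-later: reserve the funds at pledge time and only charge them if the goal is reached. A rough, provider-agnostic sketch (written in PHP like the other code in this listing, though the same flow applies in Rails; ChargeGateway and its method names are hypothetical):

        <?php
        // Hypothetical gateway: authorize() reserves funds, capture() actually charges,
        // void() releases the reservation. Most card gateways follow this pattern.
        interface ChargeGateway {
            public function authorize(string $donor, int $amountCents): string; // returns an auth token
            public function capture(string $authToken): void;
            public function void(string $authToken): void;
        }

        function settleCampaign(ChargeGateway $gateway, array $pledges, int $goalCents): void {
            $totalCents = array_sum(array_column($pledges, 'amount'));

            foreach ($pledges as $pledge) {
                if ($totalCents >= $goalCents) {
                    $gateway->capture($pledge['auth']);   // goal met: charge every donor
                } else {
                    $gateway->void($pledge['auth']);      // goal missed: nobody is charged
                }
            }
        }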

    Read the article

  • WebHost4Life, please give me my data back. My website will not work without the database.

    - by Shervin Shakibi
    I have about 4 or 5 accounts with WebHost4Life.com; these are all customers who, based on my recommendation, have been hosting with WebHost4Life. A few days ago, for some reason, they decided to migrate one of these accounts to a new server. They moved everything and created a new database on the new server, but the new database is empty. After spending hours with tech support, they acknowledged the problem and assured me it would take an hour or two for my database to be populated with the data. That was about 7 hours ago. Oh, and by the way, I pay extra for the backup plan and, yes, you guessed it, none of my backups are there. Needless to say, I'm very scared and disappointed. No one is responding to my emails or phone calls. After searching the web, I found out this has happened before; in some cases it took them days to fix the problem, and many never got it resolved and switched hosting companies. I would love to do that, but I need my 2 GB database back before I start shopping around for a new hosting company. Stay away from WebHost4Life.

    Read the article

  • How can I optimize Apache to use 1GB of RAM on my website? [closed]

    - by Markon
    My VPS plan gives me 1GB of RAM, burstable to 2GB. Of course I cannot use 2 GB, or even 1 GB, all day every day, so I'm planning to optimize the performance of my webserver. The average hits-per-hour is about 8'000-10'000, which means about 2 connections per second. The maximum hits-per-hour reached so far is about 60'000, which means about 16 connections per second. Unfortunately my current Apache configuration uses too much memory (when there are no connected clients - usually during the night - it uses about 1GB), so I've tried to customize the Apache installation to fit my needs. I'm using Ubuntu, kernel 2.6.18, with apache2-mpm-worker, since I've read it requires less memory, and fcgid (+ PHP). This is my /etc/apache2/apache2.conf:

        Timeout 45
        KeepAlive on
        MaxKeepAliveRequests 100
        KeepAliveTimeout 10
        <IfModule mpm_worker_module>
            StartServers          2
            MinSpareThreads      25
            MaxSpareThreads      75
            MaxClients          100
            MaxRequestsPerChild   0
        </IfModule>

    This is the output of ps aux for Apache:

        www-data 9547 0.0 0.3 423828 7268 ? Sl 20:09 0:00 /usr/sbin/apache2 -k start
        root 17714 0.0 0.1 76496 3712 ? Ss Feb05 0:00 /usr/sbin/apache2 -k start
        www-data 17716 0.0 0.0 75560 2048 ? S Feb05 0:00 /usr/sbin/apache2 -k start
        www-data 17746 0.0 0.1 76228 2384 ? S Feb05 0:00 /usr/sbin/apache2 -k start
        www-data 20126 0.0 0.3 424852 7588 ? Sl 19:24 0:02 /usr/sbin/apache2 -k start
        www-data 24260 0.0 0.3 424852 7580 ? Sl 19:42 0:01 /usr/sbin/apache2 -k start

    while this is ps aux for php5:

        www-data 7461 2.9 2.2 142172 47048 ? S 19:39 1:39 /usr/lib/cgi-bin/php5
        www-data 23845 1.3 1.7 135744 35948 ? S 20:17 0:15 /usr/lib/cgi-bin/php5
        www-data 23900 2.0 1.7 136692 36760 ? S 20:17 0:22 /usr/lib/cgi-bin/php5
        www-data 27907 2.0 2.0 142272 43432 ? S 20:00 0:43 /usr/lib/cgi-bin/php5
        www-data 27909 2.5 1.9 138092 40036 ? S 20:00 0:53 /usr/lib/cgi-bin/php5
        www-data 27993 2.4 2.2 142336 47192 ? S 20:01 0:50 /usr/lib/cgi-bin/php5
        www-data 27999 1.8 1.4 135932 31100 ? S 20:01 0:38 /usr/lib/cgi-bin/php5
        www-data 28230 2.6 1.9 143436 39956 ? S 20:01 0:54 /usr/lib/cgi-bin/php5
        www-data 30708 3.1 2.2 142508 46528 ? S 19:44 1:38 /usr/lib/cgi-bin/php5

    As you can see, it uses a lot of memory. How can I reduce it to fit into just 1GB of RAM? PS: I'm also thinking about switching to nginx, if Apache can't fit my needs...
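
    For reference, a lower-memory worker/fcgid configuration might look roughly like the sketch below; the numbers are illustrative starting points rather than tested values, and would need tuning against the real traffic:

        # Worker MPM: fewer and smaller server processes
        <IfModule mpm_worker_module>
            ServerLimit            2
            StartServers           1
            MinSpareThreads        5
            MaxSpareThreads       15
            ThreadsPerChild       25
            MaxClients            50
            # recycle workers periodically to limit memory growth
            MaxRequestsPerChild  500
        </IfModule>

        # fcgid: cap the number of PHP processes, since each one holds 30-45 MB here.
        # Older mod_fcgid versions spell these MaxProcessCount, DefaultMaxClassProcessCount,
        # and MaxRequestsPerProcess instead.
        <IfModule mod_fcgid.c>
            FcgidMaxProcesses            4
            FcgidMaxProcessesPerClass    4
            FcgidMaxRequestsPerProcess 500
        </IfModule>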

    Read the article

  • Should the English website use hreflang="x-default" when it doesn't auto-redirect to the user's language or country?

    - by Noam
    For each URL on my site, I'm auto-redirecting according to the Accept-Language header. The site architecture is: English version at http://mydomain.com/page, Spanish version at http://es.mydomain.com/page, etc. The English version is displayed unless I see, in the header, a specific language other than en that I support, in which case a redirect occurs. Google says this: "For language/country selectors or auto-redirecting homepages, you should add an annotation for the hreflang value x-default as well." My pages aren't language selectors, nor are they the homepage, but I am auto-redirecting. My question is: should my English version be hreflang="x-default" and/or hreflang="en"?
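
    For reference, per-URL annotations for this layout could look roughly like the sketch below, using the example domains from the question; whether the English URL should also carry x-default is exactly the open question:

        <!-- Included on every language version of the page: -->
        <link rel="alternate" hreflang="en" href="http://mydomain.com/page">
        <link rel="alternate" hreflang="es" href="http://es.mydomain.com/page">
        <!-- The annotation in question: -->
        <link rel="alternate" hreflang="x-default" href="http://mydomain.com/page">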

    Read the article

  • If a blogger writes a whole article about my website, how important are anchor texts?

    - by Noam
    Suppose there is a full article about my web service, with my brand name in the title, many relevant keywords that I would like Google to consider in my rankings, and links to my website with simple anchor text such as <brand name> and <page title>. Does it make a big difference if I get links on the actual keywords I'm after, or is it enough that these keywords are part of the written text?

    Read the article

  • Can Mailchimp APIs be used to send templated transactional email triggered by actions on my website?

    - by HenryW
    I am currently playing around with Mailchimp's APIs, but the documentation is not very clear to me. Here is what I actually want:
    1. Have the templates I created on Mailchimp be visible on my own server.
    2. Assign each template I made to a specific action (logged in, subscribed, created order, or new password). This is functionality I have already tested with Mandrill, but there the template lives in Mandrill's account.
    3. If option 1 is not possible, can I still build my own template in my own environment and send it out through Mailchimp or Mandrill?
    Should I use Mailchimp's services for this, or send the email directly from my own server? The function currently in use:

        function tep_mandrill_mail($to_name, $to_email_address, $email_subject, $email_text, $from_email_name, $from_email_address) {
            if (SEND_EMAILS != 'true') return false;

            $uri = 'https://mandrillapp.com/api/1.0/messages/send-template.json';

            $postString = '{
                "key": "xxxxxxxxxxx",
                "template_name": "sometemplatename",
                "template_content": [
                    { "name": "header", "content": "*|HEADERSTUFF|*" },
                    { "name": "main",   "content": "*|CONTENTSTUFF|*" },
                    { "name": "footer", "content": "*|FOOTERSTUFF|*" }
                ],
                "message": {
                    "subject": "'.$email_subject.'",
                    "from_email": "'.$from_email_address.'",
                    "from_name": "'.$from_email_name.'",
                    "to": [
                        { "email": "'.$to_email_address.'", "name": "'.$to_name.'" }
                    ],
                    "important": false,
                    "track_opens": true,
                    "merge": true,
                    "merge_vars": [
                        {
                            "rcpt": "'.$to_email_address.'",
                            "vars": [
                                { "name": "HEADERSTUFF",  "content": "'.$email_subject.'" },
                                { "name": "CONTENTSTUFF", "content": "'.$email_text.'" },
                                { "name": "FOOTERSTUFF",  "content": "paulvale-foot" }
                            ]
                        }
                    ],
                    "tags": [ "password_forgotten" ]
                },
                "async": false,
                "ip_pool": "Main Pool"
            }';

            $ch = curl_init();
            curl_setopt($ch, CURLOPT_URL, $uri);
            curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
            curl_setopt($ch, CURLOPT_POST, true);
            curl_setopt($ch, CURLOPT_POSTFIELDS, $postString);
            $response = curl_exec($ch);   // was missing: capture the API response
            curl_close($ch);              // was missing: close the handle

            return $response;
        }
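
    For context, a hypothetical call to the function above might look like this (all names and addresses are placeholders):

        tep_mandrill_mail('Jane Doe', 'jane@example.com', 'Your new password', 'Here is your new password ...', 'Example Shop', 'noreply@example.com');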

    Read the article

  • Is it OK to have 2 sitemaps on 1 website?

    - by user615041
    Do I have to have a sitemap page linked from my index page for bots to read it, or can I just have it anywhere on my server? I have a phpBB/WordPress integration and I need a sitemap mod for each one (or I need to have them somehow integrated together into one XML sitemap). Is this possible? What's my best option? I would have the phpBB one at something like http://www.example.com/phpbb/sitemap.html and the WordPress one at something like http://www.example.com/wordpress/sitemap.html, and then I would submit both, but not put the links in my footer where they might confuse anyone; the sitemaps would strictly be for search engines. Is this a good idea? What are your thoughts?
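
    For reference, the sitemaps.org protocol also allows a single sitemap index file that points at both child sitemaps; a minimal sketch using the example URLs above (with .xml filenames, since search engines expect the children to be XML sitemaps):

        <?xml version="1.0" encoding="UTF-8"?>
        <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <sitemap>
            <loc>http://www.example.com/phpbb/sitemap.xml</loc>
          </sitemap>
          <sitemap>
            <loc>http://www.example.com/wordpress/sitemap.xml</loc>
          </sitemap>
        </sitemapindex>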

    Read the article

  • What's the canonical process for backing up a website?

    - by Walkerneo
    This is going to sound terrible, but bear with me. I currently have a cron job that does a MySQL dump, a git add-all and commit, and a git push to Bitbucket. I set this up almost a year ago, when I didn't know much about git, backups, and general web development and administration. I haven't had the time to fix this and do it properly, but the repo has now grown quite big from accumulating large temporary files from my forum, so now I have to do something and I want to do it properly this time around. What processes do semi-large websites and personal site admins use for backing up server content? Based on what I've learned since I set this up, what I'm currently thinking of doing is:
    - Making changes on a development domain and committing the code frequently
    - Archiving the entire site after a successful deployment from the development domain
    - Having automatic daily database and user-content backups (see the sketch below)
    I still like the idea of backing up SQL dumps with git, though. I know git isn't a backup tool and that this is beyond its purpose, but the textual queries that are exported would be easily managed by git and would save a lot of space in archives.
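
    A minimal sketch of that last bullet (a nightly dump-and-archive job kept outside git), assuming the paths, database name, and 30-day retention below are placeholders:

        #!/bin/sh
        # Nightly database + user-content backup, kept outside the git repo.
        STAMP=$(date +%F)
        BACKUP_DIR=/home/backup
        mysqldump --defaults-extra-file=$BACKUP_DIR/.my.cnf mydb | gzip > $BACKUP_DIR/db-$STAMP.sql.gz
        tar czf $BACKUP_DIR/content-$STAMP.tar.gz /var/www/site/uploads
        # prune archives older than 30 days
        find $BACKUP_DIR -name '*.gz' -mtime +30 -delete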

    Read the article

  • Drupal Blog vs. WordPress Blog for a Drupal Website? [closed]

    - by Norma Riter
    Is there a blog platform of preference for SEO when it comes to Drupal websites? I ask because WordPress seems to have the better plug-ins, though it may not integrate as well. Any thoughts on this? I am asking primarily from an SEO perspective, though also from a design one. In other words, there are so many fabulous blog templates for WordPress, and I'm not sure the same is true for Drupal. I am struggling to find Drupal blog templates to purchase, such as premium ones.

    Read the article

  • Amazon Careers website - are resumes processed in plain text format only?

    - by sapphiremirage
    The submission site has the following options: "Please upload your resume (Word Document, max size: 512 KB)" OR "Please copy and paste the text version of your file here", with a text box below the latter option. I went ahead and uploaded my shiny LaTeX resume (as a PDF), despite the fact that they seem to want a Word document, and there didn't seem to be any issues. However, when I went back to edit my profile, there was no evidence that my PDF had been uploaded, other than a text version of my resume, awfully formatted and clearly stripped from the PDF, sitting in the text box below "Please copy and paste the text version of your file here". Exasperated, I did a quick and dirty copy of the text from my resume into a Word doc and uploaded that. Same result: no evidence of a file uploaded, just a stripped text version in the text box. What I'm wondering now is: are they only going to look at the text version of my resume? If that's the case, then I'm obviously going to edit it so that it looks halfway decent and doesn't contain such atrocities from the conversion as "Other Skills: LTEX". I can pretty up little text files without too much effort, so this isn't that big a deal. However, my LaTeX resume is going to look better than anything I can do in plain text, so if the site is actually keeping a copy of that, then I certainly don't want to override it. Has anyone here gone through the Amazon hiring process or interviewed candidates, and can tell me how this works? (i.e. When on site with Amazon, did the interviewers have diversely formatted resumes, or did they all look suspiciously similar?)

    Read the article

  • Is there a file browser like an ER diagram for viewing website physical path structures?

    - by EASI
    I want to know whether there is a tool like this to replace Windows Explorer and similar programs for managing files, especially website structures that can sometimes get very big. I am asking here because I sincerely do not know what terms to use in Google to find anything like it. What I have in mind is like a diagram, but showing the tree directory structure and the contents of every folder at the same time, not just the one that is in focus.

    Read the article

  • Is this the correct way to implement .NET MVC website structure?

    - by aspdotnetuser
    I have recently seen a .NET MVC solution in which the markup in the .aspx views has a Controller as its model, and the .ascx user controls they contain use a separate model. I'm new to MVC and I wanted to find out about a few things I'm not clear on. An example of how the code is implemented: the UserDetails.aspx view has markup showing that it uses UserDetailsController.cs as its model, and it contains RenderPartial("User_Details.ascx", UserDetailsModel), passing in the UserDetailsModel. Is this the standard/correct way of implementing MVC, or just one way to implement it? I also noticed that the classes used as models appear to be service classes with [DataMember] and [DataContract] attributes on the class and its properties - what is the advantage of this implementation?

    Read the article
