Search Results

Search found 18210 results on 729 pages for 'website promotion'.

  • Blatant copyright theft

    - by Tom Gullen
    Found a user on the forum trying to solicit business for his website; a good user reported it and I checked the website out. First, and most seriously, it's attempting to sell our original software, which is open source. Our open source software is around 15 MB, and he's serving a 50 MB download and trying to sell it for $20. He's also stolen our CSS/images/site design in general, which is all custom built. I attempted to open a reasonable discussion with him, and he responded promptly saying he would remove the offending materials if he could just have 3 days to sort it out, which I accepted. I'm not sure what his plan was, because everything on that site is offending material. Anyway, he messaged back saying the site was offline, and it was, but it went back online shortly afterwards. It's pretty sickening that someone is selling open source work as their own (the site's About Us page references him as the sole developer, etc.; it's unbelievable to read).

    I want to shut it down. What are my options? I'm going to contact his domain registrar, web host, and PayPal (that's how he's selling the program). Any other ideas?

  • Windows 7 - XP Mode - Apache

    - by Howard
    I've set up Virtual PC and XP Mode on my Windows 7 Pro machine. Using Apache 2.0.52, I have no problem getting my website up and running on the Windows 7 machine, but under VPC/XP Mode the best I can do is localhost-only access. What do I need to do to enable HTTP connections from other machines? I need XP Mode because, besides the website, I also run a web BBS and a DOS-based (via telnet) BBS. Some of the apps in the DOS BBS just won't work under 64-bit, no matter what compatibility settings are used. Thanks in advance...
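    For reference, a minimal sketch of the two things that usually gate this (assumptions, since the post doesn't show the configuration): the XP guest's network adapter must be bridged to the LAN rather than using shared/NAT networking, and Apache must listen on all interfaces, not just the loopback:

        # httpd.conf (Apache 2.0 syntax): accept connections on every interface
        Listen 80
        # A "Listen 127.0.0.1:80" line here would restrict Apache to localhost.
        # The XP guest's firewall must also allow inbound TCP on port 80.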

  • Can't access internet even though everything is working

    - by entity64
    A friend recently upgraded to a new cable internet connection. The modem connects to the router, and her roommates' various PCs and smartphones connect to the router; they have no problem accessing the internet. She has Windows 8 and can't access any website (over both Wi-Fi and Ethernet). DNS (UDP) is working, DHCP set everything up correctly, Wi-Fi is working, and traceroutes and pings (ICMP) go through with no problem at all. But neither Dropbox nor Skype nor Spotify nor any browser (all TCP) can reach any website. The thing is, though, she can connect through the university Wi-Fi and via a neighbor's Wi-Fi. It's just her home connection. No firewalls are running and the computer is clean - no malware. How can it be that only her home connection won't work while others do?

  • Tracking form abandonment

    - by Alec Sanger
    I'm looking for a decent way to track form abandonment. Ideally, I would like to see how many people start filling out a form but do not complete it, as well as the last field that was filled out. The website is a fairly large WordPress site with quite a few forms. Some of these forms are to register for events, some are for donations, some are for information requests. My first attempt at this was adding a generic jQuery handler that bound functions to all forms on the site. When a form element was blurred, I would trigger a Google Analytics event with the name of the form, the name of the field, and whether or not it was filled. I expected to be able to go to the Event Flow section in Google Analytics and see the flow of these form events; however, since there are so many forms and other events occurring on the website, Google wouldn't let me break them out very well. The other issue is that Quform doesn't give its fields relevant names, and it doesn't look like we can name them ourselves. This results in a lot of ugly field names that don't mean anything without cross-referencing the actual form. Does anybody have any suggestions on how I can achieve more usable form abandonment metrics in a scenario like this?
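    For reference, a minimal sketch of the kind of generic handler described above (assuming jQuery 1.7+ and the classic ga.js _gaq API; the category and label strings are illustrative, not from the original):

        // Report each blurred field; the last event per visitor approximates
        // the abandonment point.
        jQuery(function ($) {
          $('form :input').on('blur', function () {
            var $form = $(this).closest('form');
            var formName = $form.attr('name') || $form.attr('id') || 'unnamed-form';
            var fieldName = this.name || this.id || 'unnamed-field';
            var state = $(this).val() ? 'filled' : 'empty';
            _gaq.push(['_trackEvent', 'Form: ' + formName, fieldName, state]);
          });
        });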

  • How are suspected DoS attacks handled by webservers?

    - by Jan Kuboschek
    I rent a server somewhere out in Canada that I'm using to host a website of mine. That website has close to 400,000 pages that I wanted to index today. For that, I wrote a crawler a while back (see JCrawler on Stackoverflow.com). Now, I'm greedy and didn't want it to take too long, so I ran multiple threads, resulting in some 60+ requests per second from my IP. A couple of minutes later, my server locked me out. I can still FTP into it, but HTTP requests get no response. As a server administrator or user, do you have any idea how servers usually handle these situations? Is it common to place a permanent or temporary ban on the IP, or what is typically done? Naturally, I'll re-run my software with fewer requests once I'm back on.
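    As an illustration of one common server-side mechanism (an assumption - the post doesn't say what the host actually runs), Apache's mod_evasive temporarily blocks an IP that exceeds a request threshold, which would match the symptoms above:

        # httpd.conf: example mod_evasive thresholds (Apache 2.x)
        <IfModule mod_evasive20.c>
            DOSPageInterval   1    # seconds per page-count window
            DOSPageCount      10   # >10 hits on one page per window -> block
            DOSSiteInterval   1    # seconds per site-count window
            DOSSiteCount      100  # >100 hits on the whole site per window -> block
            DOSBlockingPeriod 60   # how long the IP stays blocked, in seconds
        </IfModule>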

  • Web Hosting Checklist

    - by Chris
    I am a web developer who is starting to look into hosting his own website. I would like to showcase my programming skills (PHP, MySQL, C#, WordPress). I am OK with my knowledge of the languages, but the actual hosting side is where my knowledge starts to get a little shaky. I know the basics (bandwidth, sub-domains, rewrite rules), but I would love your input to help me formulate a checklist of things to look out for in a web-hosting service. Also, I was wondering if there are any reliable hosting providers that give you the option to host both C# code-behinds and PHP code, as I would like to have two versions of my site, one in C# and one in PHP. The hope is that if I need to look for another job, this website will help me show possible employers my server-side knowledge. I hope this is enough info; I did some research online but found a bunch of useless articles, and I've always had luck on the StackExchange sites. So hopefully you can help me. Thanks a lot.

  • Using .htaccess, can you hide the true URL?

    - by Richard Muthwill
    So I have a web hotel with one main website, http://www.myrootsite.com/, and a few websites in subdirectories, in a folder called projects. I have domain names pointing to the subdirectories, but when holding the mouse over a link in those websites, the URLs are shown as:

        http://www.myrootsite.com/projects/mysubsite/contact.html

    When I'm on mysubsite.com I want them to be shown as:

        http://www.mysubsite.com/contact.html

    I spoke to support for the web hotel and the guy said to try using .htaccess, but I'm not sure exactly how to do this. Thank you very much for your time!

    Edit, for more information: My website is http://www.example1.com/ and I also own http://www.example2.com/. All of example2.com's files are in example1.com/projects/example2/. When you visit example2.com, you'll notice all of the URLs point towards example1.com/projects/example2/, but I want them to point towards example2.com/. Can this be done? I hope this is enough info for you to go on :).

    Edit, for w3d: I go to the URL mysubsite.com and the browser shows the URL mysubsite.com. The service I'm using creates an iframe around myrootsite.com and shows the URL mysubsite.com. I just hate that in Firefox and Internet Explorer, holding the mouse over a link shows that the destination URL is: myrootsite.com/projects/mysubsite/...
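    For reference, a minimal .htaccess sketch of the usual approach (an assumption: it relies on pointing mysubsite.com's DNS at the root site as a real host, rather than using an iframe-based forward, so the rewrite happens invisibly on the server):

        RewriteEngine On
        # Serve mysubsite.com from its subdirectory without exposing the path
        RewriteCond %{HTTP_HOST} ^(www\.)?mysubsite\.com$ [NC]
        RewriteCond %{REQUEST_URI} !^/projects/mysubsite/
        RewriteRule ^(.*)$ /projects/mysubsite/$1 [L]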

  • Why is Windows Update telling me to install Internet Explorer 10 even though it is already installed?

    - by gparyani
    I have a Windows 7 laptop. For some reason, Windows Update keeps telling me to install Internet Explorer 10 for Windows 7 even though it is already installed (downloaded from Microsoft's website). Whenever I try to install it from Windows Update in order to get rid of the notice, I receive error 9C48. What is the problem and how do I solve it? (I've noticed several entries for similar problems online, but those are all for Internet Explorer 9.) As I said in the comments, my system also keeps trying to install the same update when it is shutting down, failing with the same error. It is hogging my shut-down button and wasting time during shutdown.

    Update: I just went and installed Internet Explorer 11 from Microsoft's website. However, Windows Update continues to offer me Internet Explorer 10. How can I tell it that the update is in fact installed, so that it starts offering me other updates instead?

  • Web services not being reached through the web browser [closed]

    - by Tony
    I am trying to reference my .asmx web services in .NET, but my server is not exposed to the internet. When I browse to the address below, I get the message quoted underneath it. What's the reason for not being able to see the directory? Am I missing something in my IIS configuration? Am I missing anything in my permissions? For reference, I have other folders with web services and I have the same issue with them. When I log in to the server, I do it with my Windows user and password (I am using Windows authentication). It's necessary to mention that when I enter the URL, I get a popup screen asking for my user ID and password, but it seems it can't validate them, since it keeps asking me a couple of times. Let me know if you need more information to address this issue.

        http://appsvr02/Inetpub/wwwroot/DevWebApi/

    Internet Explorer cannot display the webpage

    What you can try:

      • It appears you are connected to the Internet, but you might want to try to reconnect to the Internet.
      • Retype the address.
      • Go back to the previous page.

    Most likely causes:

      • You are not connected to the Internet.
      • The website is encountering problems.
      • There might be a typing error in the address.

    More information: This problem can be caused by a variety of issues, including:

      • Internet connectivity has been lost.
      • The website is temporarily unavailable.
      • The Domain Name Server (DNS) is not reachable.
      • The Domain Name Server (DNS) does not have a listing for the website's domain.
      • If this is an HTTPS (secure) address, click Tools, click Internet Options, click Advanced, and check to be sure the SSL and TLS protocols are enabled under the security section.

    For offline users: you can still view subscribed feeds and some recently viewed webpages. To view subscribed feeds, click the Favorites Center button, click Feeds, and then click the feed you want to view. To view recently visited webpages (might not work on all pages), click Tools, then click Work Offline; then click the Favorites Center button, click History, and then click the page you want to view.

  • Good podcasting solution?

    - by burnso
    I simply ask for the best, most common and simple way to set up a podcast. (Please, I need an answer so I can close this question.) I run a Joomla-based website for a small church and now need a simple, cheap and effective solution for an audio podcast. I am looking for a solution that will do the following (see the feed sketch after this list):

      • Users upload audio files to a service (preferably not our own site) that is cheap, fast and simple to use; Dropbox, for example?
      • Files are easily embeddable into Joomla website articles to be played on the spot (through a simple-to-use media player), preferably through RSS feeds (to make it easy to update every week).
      • Files are downloadable.
      • Files are playable on iPhones and other smartphones.
      • The solution can be broadcast via iTunes.
      • The solution doesn't need a lot of extra, new third-party software. I'd rather keep it simple and familiar than have a completely new system with a steep learning curve.

    At the moment we use Vimeo to host the podcast, through video files, but we'd like to move to something simpler that doesn't involve a series of difficult steps to upload the files to the web.
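    On the iTunes point, podcast distribution mostly comes down to publishing an RSS feed whose items carry audio enclosures; a minimal sketch (all titles and URLs hypothetical):

        <?xml version="1.0" encoding="UTF-8"?>
        <rss version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd">
          <channel>
            <title>Example Church Podcast</title>
            <link>http://example.org/podcast</link>
            <description>Weekly talks</description>
            <itunes:author>Example Church</itunes:author>
            <item>
              <title>Talk for week 1</title>
              <!-- the enclosure is what podcast apps download and play -->
              <enclosure url="http://example.org/audio/week1.mp3"
                         length="12345678" type="audio/mpeg"/>
              <guid>http://example.org/audio/week1.mp3</guid>
              <pubDate>Sun, 06 Jan 2013 10:00:00 GMT</pubDate>
            </item>
          </channel>
        </rss>

    A feed like this is what iTunes ingests when a podcast is submitted, and Joomla articles can embed the same MP3 URLs.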

  • Accessing localhost:8080 through local network

    - by Theron Luhn
    I'm developing a Python WSGI website. I'm running a Paste development server on my Mac (OS X 10.7) on port 8080. I want to test the website on some other devices and OSs I have connected to the local network (a Windows 7 VM, iPad, iPhone, etc.), but am having trouble. I turned on Web Sharing and am able to access that (port 80) without a problem on all my devices; port 8080 still doesn't work. An excerpt from my Paste configuration:

        [server:main]
        use = egg:waitress#main
        host = 127.0.0.1
        port = 8080

    The OS X firewall (Settings - Security - Firewall) is off, and I have no other firewall software installed. My network runs through a Linksys WRT160N router. I haven't done much with the settings, so most of them are at their defaults. I've been Googling all morning but can't find a solution.
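    One thing worth noting from the excerpt (a sketch of a likely fix, not a confirmed diagnosis): host = 127.0.0.1 binds the server to the loopback interface only, so no other device can reach it. Binding to all interfaces would look like:

        [server:main]
        use = egg:waitress#main
        host = 0.0.0.0
        port = 8080

    After that change, the site would be reachable from the LAN at http://<the Mac's LAN IP>:8080/.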

  • Where can I learn about managing domain names for my websites? [closed]

    - by Shahbaz
    [I originally asked this question on serverfault.com, where it was closed as 'out of scope.' Hopefully it is appropriate for this forum.] I am a developer who doesn't understand how to effectively manage internet domain names. Say I registered a name with Namecheap and host a website on Linode. Now, what is an A record? What is a name server, and do I host it with Namecheap or Linode? Why would I pay Amazon when others are free? Does any of this matter in terms of website latency or reliability? I feel like a script kiddie, copying and pasting others' configurations and hoping they work. Is there a book or other resource that explains all this? I know Amazon is full of books about DNS, but as far as I can tell they are about setting up DNS servers for local networks, not the internet.

    P.S. To emphasize: I'm asking for books or long write-ups that explain this to technically competent people who just haven't had to think about the role of commercial registrars, name servers, commercial hosts, commercial websites, and how all the parts play together on the real internet (not local networks).
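    For what it's worth, the moving parts can be watched directly with dig, which makes the terms concrete; a hedged illustration (the answers shown are placeholders, not real lookups):

        $ dig example.com A +short     # the A record: hostname -> IPv4 address
        203.0.113.10
        $ dig example.com NS +short    # the name servers authoritative for the domain
        a.iana-servers.net.
        b.iana-servers.net.

    The registrar (Namecheap here) records which name servers answer for the domain; whoever runs those name servers (Namecheap, Linode, or a paid service) publishes the A record that points the name at the web host.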

  • Location-Based redirection and duplication in sub-directories affecting SEO

    - by Joshua
    I currently own the website www.xyz.com. The website has a sub-directory for each of the 3 target countries: .../en-US/ (United States), .../es-MX/ (Mexico), and .../es-DO/ (Dominican Republic). I have two main questions about this setup:

    1. Currently, the main domain/root (xyz.com) contains a blank index.php file, but I would like a user to be redirected to one of the sub-directories based on their regional location. What is the best way to accomplish this? I have looked at using browser-language-based redirection, but how would I know whether to direct a user to the MX or the DO site if the browser language is set to Spanish? Is there a way to detect a user's geographic location?

    2. The 3 websites are practically identical, except that they have 3 unique color schemes and the US site is in English while the MX and DO sites are in Spanish. My problem is that I believe Googlebot is penalizing/banning my site because the Spanish text on the MX and DO pages is nearly identical and is thus marked as duplicate/spam. Is there a way to avoid this?
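    On the first question, browser language alone cannot distinguish MX from DO, so a redirect would need an IP-based geolocation (GeoIP) lookup on the server. On the second, a minimal sketch of the hreflang annotations that search engines use to tell regional variants apart (a standard technique, not from the original post; URLs follow the structure described above):

        <link rel="alternate" hreflang="en-US" href="http://www.xyz.com/en-US/" />
        <link rel="alternate" hreflang="es-MX" href="http://www.xyz.com/es-MX/" />
        <link rel="alternate" hreflang="es-DO" href="http://www.xyz.com/es-DO/" />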

  • Is the decision to use SNI or IP based SSL made during cert purchase or cert installation?

    - by Neil Thompson
    It's time to renew an SSL cert, but the website will soon be moving from a dedicated machine with a fixed IP to a cloud-based host behind a load balancer. When I renew or re-purchase my SSL cert, do I make the decision about whether it should be an SNI- or IP-based SSL cert at the point of purchase, or is a cert a cert, and it's all about where and how it's installed? I'm hoping the renewed cert can continue to be IP-based for now, and in a few months, when the website (and its domain, of course) moves to the cloud, I can re-use the cert in 'SNI mode'.
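    For what it's worth, a sketch of why this is usually server configuration rather than a property of the certificate (nginx syntax; hostnames and paths hypothetical): with SNI, the client sends the hostname in the TLS handshake and the server picks a certificate per server_name on a shared IP, using ordinary certificates:

        server {
            listen 443 ssl;
            server_name www.example-one.com;
            ssl_certificate     /etc/ssl/example-one.crt;
            ssl_certificate_key /etc/ssl/example-one.key;
        }
        server {
            listen 443 ssl;                   # same IP and port
            server_name www.example-two.com;  # SNI hostname selects this block
            ssl_certificate     /etc/ssl/example-two.crt;
            ssl_certificate_key /etc/ssl/example-two.key;
        }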

  • Changing Windows 'hosts' file in guest OS under Parallels Desktop 6 on Mac [migrated]

    - by ling
    I am running Win7 in Parallels Desktop 6 on a Mac. On the Mac, /etc/hosts contains:

        127.0.0.1   mysite

    and it works fine: I can type mysite in the URL bar and the website displays. What I want is to reproduce the same on Win7, with IE8 for example. Another person did it successfully here, but I can't: Parallels: How to see a Mac-hosted website from Windows? On the Mac, ifconfig -a shows the Parallels interface:

        vnic0: 10.211.55.2

    so on Win7 I tried this in C:\Windows\System32\drivers\etc\hosts:

        10.211.55.2   mysite

    What am I missing?

  • nginx: URL rewrites and performance

    - by j0nes
    I have a website where I need to change the URL structure. The old URLs look like /olddir/part1_de.htm; the new ones will look like /newdir/sub/category/anotherpage.htm. There are a lot of URL rewrites I need to do; I assume about 500 distinct rewrites in the end. As my website gets quite a lot of traffic, my main concern at the moment is performance. My questions are:

      • I assume that for each request, the rewrites block will be parsed and the regexes will be evaluated. Am I right?
      • Will there be a performance penalty if I use these rewrites?
      • Can nginx handle this?
      • Are there any "best practices" to follow when doing a lot of rewrites? (A sketch of one candidate follows below.)
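    On the best-practices question, one common pattern for large static rewrite sets (a sketch, not from the original post) is nginx's map directive: the table is built once and consulted with a hash lookup per request, instead of testing hundreds of regexes:

        # in the http {} block
        map $request_uri $new_uri {
            default               "";
            /olddir/part1_de.htm  /newdir/sub/category/anotherpage.htm;
            # ... the remaining ~500 old-path -> new-path pairs ...
        }

        # in the server {} block: redirect only when a mapping exists
        if ($new_uri) {
            return 301 $new_uri;
        }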

  • Multiple domains for different products?

    - by alexandertr
    I have a website with software applications. Is it good for SEO to choose one keyword-rich domain name for each of our software products, or should we stick to a single domain? From a user's perspective, I think it would be easier to remember a keyword-rich domain, as the user will instantly know what the product is for. But I have read articles saying that the latest trend in SEO is to stick to one domain for all of your products and invest in this single-domain website. Is that true? What do you advise? Should I register a separate domain for each of our products, or should I use only one single domain? Should I do a 301 redirect with a .htaccess file to a single domain? And what about the sitemaps: should I register all sites in Google Webmaster Tools and post a separate sitemap for each one of them, should my main site's sitemap include all pages, or should the separate domains have their own sitemaps?
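    On the 301 option, a minimal .htaccess sketch of folding one product domain into the main site (domain names hypothetical; this assumes both names point at the same server):

        RewriteEngine On
        # Permanently redirect the product domain to its section of the main site
        RewriteCond %{HTTP_HOST} ^(www\.)?product-example\.com$ [NC]
        RewriteRule ^(.*)$ http://www.main-example.com/product/$1 [R=301,L]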

  • Artists and music - Need Help Deciding on a CMS

    - by infty
    A friend has asked me to build a site with the following options:

      • Staff members must be able to add new music and artists to the page.
      • A gallery must be provided; it would also be good if each artist could have his/her own smaller gallery.
      • Users must be able to vote for artists.
      • Users must be able to take part in discussions (forums or comment sections).
      • Staff members must be able to blog.
      • Staff members must be able to write articles.

    I did a small project where I actually implemented all of these features, but I want to use an existing content management system so that future developers can, hopefully, extend the website more easily, and so that I don't have to provide too much documentation. I have never developed a website using an external CMS like Drupal or WordPress, and after seeing hours of tutorial videos on both systems, I still can't make up my mind on whether I should:

      a) use Drupal 7
      b) use WordPress 3
      c) create my own CMS

    I can imagine that staff members will also want to create content using iPhone- or Android-based mobile devices, but this is not a required feature. Can someone with experience please tell me about their experiences with larger projects like this? The site will have approximately 400,000-500,000 visitors (not daily visitors; based on last year's numbers over a period of 4 months).

  • IIRF - Redirecting all traffic to the HTTPS equivalent

    - by GordonB
    I'm using IIRF and having some trouble getting it to redirect all traffic to the secure version of my sites. So... I have a website with about 20 apps in virtual directories in IIS6. The website accepts traffic on ports 80 and 443. I want to use IIRF to redirect all port 80 traffic, e.g.:

        http://myserver/app1/page1/param1
        http://myserver/app2/
        http://myserver

    to the secure equivalent (https). Here's my config so far:

        # Iirf.ini
        #
        # ini file for IIRF
        #
        RewriteLogLevel 1
        RewriteLog D:\Websites\Apptemetry\IirfLogs
        RewriteEngine ON
        StatusInquiry ON
        IterationLimit 5
        RewriteLogLevel 3
        RewriteCond %{HTTPS} off
        RewriteCond %{SERVER_PORT} ^80$
        RedirectRule ^http(.*)$ https$1

    Can anyone advise the correct configuration to use to redirect all traffic?
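    A possible correction, offered as a sketch to verify against the IIRF documentation: IIRF matches rule patterns against the request path (which starts with /), not the full URL, so ^http(.*)$ never matches anything. Matching the path and rebuilding the URL would look like:

        # Redirect any plain-HTTP request to its HTTPS equivalent
        RewriteCond %{HTTPS} off
        RedirectRule ^/(.*)$ https://myserver/$1

    (The hostname is hardcoded here to match the examples above.)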

  • Can't get IIS6 to set up properly.

    - by AngryHacker
    I am trying to set up a website on a Windows 2003 R2 box. So I cranked up IIS, clicked on Web Sites, created a new website, and set up the host headers for my domain. No matter what I did, I kept getting Service Unavailable errors. Finally, I followed the instructions in this link, and it worked. Basically, the page recommends switching to IIS5 isolation mode. I don't really understand why it works, and I don't understand why it didn't work before. If someone could enlighten me, that would be great. Bonus points for explaining how to do it in the native IIS6 (worker process) isolation mode.

  • How to move mail accounts when migrating webhosting

    - by pkswatch
    I am migrating my website abc.com from one web hosting company to another in a shared hosting environment. Both have cPanel. The account I am moving to is a multi-domain hosting account with 3 domains already in it. The problem is, I have many email accounts associated with my website abc.com, which are accessed using webmail. So if I move the site to the other host, will I lose all those accounts and their emails? If yes, how should I synchronise the email accounts so that all the accounts and the emails they contain remain intact? I saw several sync tools, like imapsync, but these require two hosts while synchronizing, and as you see, I have just one domain name to be synchronized across 2 servers. P.S. I do not have SSH access on either of them, and I have made a complete backup of all files using the Backup Wizard in cPanel.
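    For reference, a hedged sketch of how imapsync is typically pointed at both servers in exactly this situation: it addresses them by server IP or hostname rather than by the shared domain name, and it can run from any third machine, so the lack of SSH on the hosts may not matter (the IPs and passwords below are placeholders):

        imapsync --host1 203.0.113.10 --user1 user@abc.com --password1 'oldpass' \
                 --host2 203.0.113.20 --user2 user@abc.com --password2 'newpass'

    One run per mailbox, with host1 as the old server and host2 as the new one.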

  • Google Analytics custom variables and how are they recorded?

    - by mrtsherman
    I have been asked to add GA custom-variable tracking to my company's website. The company website uses server-side includes, so a modification to the tracking code happens identically everywhere; maintenance is therefore a headache. Also, GA takes about twenty-four hours for custom variables to start showing up in reports, and that makes troubleshooting a headache. So say you have custom variables like these:

        // visitor level tracking, id = 12345
        _gaq.push(['_setCustomVar', 1, 'id', '12345', 1]);

        // page level tracking, email = [email protected]
        _gaq.push(['_setCustomVar', 1, 'email', '[email protected]', 1]);

    The marketing people want the following out of this:

      • A user visits the site and we record a unique id for them. Whenever they return, this id is used in GA.
      • A user signs up for our newsletter on page X and we record their email address. Whenever they return, this email address is used in GA.

    Now, a big problem for me is that I don't use GA and the marketing people don't use custom variables, so we don't actually know how this will work (see the sketch below). Do I want page, session or visitor level tracking? What happens given that the same GA code is used on every page? If they visit the email sign-up form and we record the email address, but then they go somewhere else where the email is nonexistent, will the value get overwritten? Sorry for the long question, but there are a lot of unknowns for a GA noob.
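    For what it's worth, a sketch of how this is often laid out under the classic _setCustomVar API (the slot numbers are a choice made here, not from the original; each of the five slots holds one variable, and the scope argument is 1 = visitor, 2 = session, 3 = page):

        // Visitor-scoped values are stored in the GA cookie and persist across
        // visits, so they can be set once rather than on every page view.
        _gaq.push(['_setCustomVar', 1, 'id', '12345', 1]);  // slot 1, visitor scope
        _gaq.push(['_setCustomVar', 2, 'email',             // slot 2, visitor scope
                   'user@example.com', 1]);                 // (placeholder address)

        // Custom variables are only transmitted with the next hit:
        _gaq.push(['_trackPageview']);

    Using two different slots keeps the id from being overwritten by the email value; reusing slot 1 for both, as in the snippet above, would overwrite.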

  • How to emulate a domain name - Webmin Setup

    - by theonlylos
    I am currently working on a client project where they are using a custom CMS that relies on specific domains being configured for it to work properly. In English, that means that when I try running the site in my test environment, the entire website fails because it isn't located on the primary domain (and I'm pretty sure the domain is hard-coded, since there's no control panel to adjust the file locations). Anyway, what I wanted to ask is whether it is possible to keep my test environment URL but have Apache and DNS emulate my client's website URL locally, rather than calling the actual name servers. Right now I have a virtual host set up in Apache, but I am not sure where to go from there. Any assistance is greatly appreciated.
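    A minimal sketch of the usual approach, assuming the hard-coded domain is clientdomain.com (a hypothetical name) and the test machine runs Apache: map the name to the test machine in the local hosts file, so no real name servers are consulted, and let the virtual host answer for it:

        # /etc/hosts on the test machine
        # (C:\Windows\System32\drivers\etc\hosts on Windows)
        127.0.0.1   clientdomain.com www.clientdomain.com

        # Apache virtual host
        <VirtualHost *:80>
            ServerName   clientdomain.com
            ServerAlias  www.clientdomain.com
            DocumentRoot /var/www/client-site
        </VirtualHost>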

  • Is sending data to a server via a script tag an outdated paradigm?

    - by KingOfHypocrites
    I inherited some old JavaScript code for a website tracker that submits data to the server using a script URL:

        var src = "http://domain.zzz/log/method?value1=x&value2=x";
        var e = document.createElement('script');
        e.src = src;

    I guess the idea was that cross-domain requests didn't have to be enabled. Also, it was written back in 2005, and I'm not sure how well XMLHttpRequest was supported at the time. Anyone could stick this on their website and send data to our server for logging, and it would work in just about any browser with JavaScript. The main limitation is that all the server can do is send back JavaScript code, and each request has to wait for a response from the server (in the form of a generic acknowledgement JavaScript method call) to know it was received before sending the next. I can't find anyone doing this online, or any metrics as to whether it is faster or more secure than XMLHttpRequest. I don't know if this is just an old way of doing things or still the best way to send data to the server when you are mostly sending data one way and need the best performance possible. So, in summary: is sending data via a script tag an outdated paradigm? Should I abandon it in favor of XMLHttpRequest?
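    For comparison, a sketch of the modern equivalents of this one-way logging call, reusing the endpoint from the snippet above (as written, that snippet also needs a document.body.appendChild(e) call before any request fires; both alternatives below assume the server allows cross-origin requests):

        // navigator.sendBeacon queues a small POST that survives page unload,
        // with no need to wait for a script response as an acknowledgement
        navigator.sendBeacon('http://domain.zzz/log/method', 'value1=x&value2=x');

        // or an asynchronous XMLHttpRequest, which reports delivery via a callback
        var xhr = new XMLHttpRequest();
        xhr.open('GET', 'http://domain.zzz/log/method?value1=x&value2=x');
        xhr.onload = function () { /* acknowledged */ };
        xhr.send();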

  • Transferring whole folders using Dropbox without the desktop app

    - by Ender
    Is there any way I can upload whole folders to my Dropbox account without the desktop application? I am syncing my home computer with a computer at university, and the university computers do not allow users to install applications anywhere, meaning that I cannot use the Dropbox utility to share my folders. I've attempted to use the online utility (on the website) to share my files, but much of my work is spread across dozens of files, each within its own folder. All you can do with the upload feature on the website is upload individual files, and for my projects that would take a long time.
