Search Results

Search found 17124 results on 685 pages for 'final cut pro'.


  • What is the proper SEO way to add and remove links to our site and sitemap?

    - by ElHaix
    As content fluctuates on our site, we will obviously add the new pages to our sitemap and link list for search engine indexing. Over time, certain links will become less relevant, and we would like to know how to keep them from being crawled. The links themselves may not be removed - but we don't want to dilute the link list with less relevant links as time passes. I'm guessing the status code of the page would change - to what? Should they also be removed from the sitemap?
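    One approach, sketched below on the assumption of an Apache host with mod_headers: keep the retired pages live but serve an X-Robots-Tag header asking search engines to drop them, and remove their entries from the sitemap at the same time. The file name is a placeholder.

        # Sketch: keep the page reachable but ask crawlers to de-index it.
        # "old-less-relevant-page.html" is a placeholder path.
        <Files "old-less-relevant-page.html">
          Header set X-Robots-Tag "noindex, nofollow"
        </Files>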

    Read the article

  • Covering Yourself For Copyrighted Materials [on hold]

    - by user3177012
    I was thinking about developing a small community website where people of a certain profession can register and post their own blogs (which include an optional photo). I then got to thinking about how people might use this: if they are given the option to add a photo, they may well use one that they simply find on Google, another social network, or an existing online blog or magazine article. So how do I protect myself from getting a fine slapped on me and make any infringement purely the fault of the individual uploader? I plan on having an option where the user can credit a photo by typing in the original photographer's name and web link (optional), and on making them tick a checkbox stating that the post is their own content and that they have permission to use any images - but is that enough to cover myself? How do other sites do it?

    Read the article

  • Joomla: Prevent certain user from visiting a certain page

    - by user6657
    I have an internal section in my Joomla only for "Registered" users. There they can edit preferences. Multiple users can access the same preferences (different stakeholders). Now I have "smithcorp" and "smithcorp_guest". They should both be able to view the same things, but "smithcorp_guest" shouldn't be able to edit the preferences. Is there a way to forbid "smithcorp_guest" access to this page? I have only seen access regulation via the user level, i.e. unregistered, registered, admin. Thanks, MrB

    Read the article

  • Meta Refresh for change of page name and content

    - by user3507399
    Hopefully just a quick one. I've got a client who is changing the name of a workshop they run. This means a change of URL and page title for keywords on which they currently have first-page rankings. The keywords are still relevant, so what I want to avoid is a 301 redirect to a page that has different keywords from the previous page. Is the best option to keep the old page live, with its URL and title, and use a meta refresh to redirect after a period of time (not instantly)? That way the SEO ranking is retained for the previous workshop name while they work on the ranking for the new name. Or would a 301 redirect have an adverse effect? Thanks!
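    For reference, a delayed meta refresh on the old page would look something like this minimal sketch; the 30-second delay and the target URL are placeholder assumptions, not recommendations.

        <!-- Sketch: keep the old page live, then refresh to the renamed workshop's URL -->
        <meta http-equiv="refresh" content="30; url=http://example.com/new-workshop-name/">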

    Read the article

  • Will removing unused query string parameters negatively affect SEO?

    - by trm
    Will changing links to remove query string parameters that are no longer used have any negative impact on search engine rankings? Say I have a page about.php on my site, and all of my links to this page are of the form http://www.example.com/about.php?foo=bar and I've made some changes to the script such that the parameter foo is no longer used. I would like to remove the unused parameter from the links so the URL will look cleaner, but I am concerned that this could cause problems with SEO. Is it safe to remove ?foo=bar from my links?
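    If the aim is to consolidate the old parameterised URLs onto the clean one, a minimal .htaccess sketch (assuming Apache with mod_rewrite) might look like this; the trailing "?" in the substitution is what drops the query string.

        # Sketch: 301 requests for about.php carrying the obsolete foo parameter
        # to the clean URL.
        RewriteEngine On
        RewriteCond %{QUERY_STRING} (^|&)foo=bar(&|$)
        RewriteRule ^about\.php$ /about.php? [R=301,L]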

    Read the article

  • Dedicated server: managed hosting or manage it myself?

    - by ddawber
    We're currently hosting a number of sites on a self-managed dedicated server. Some companies, however, offer a managed dedicated server hosting service. They offer:

    - Roughly the same server spec
    - Ticketing system support
    - Managed daily backups
    - Virtual firewall (but with a limit of 10 IP addresses allowed through at any one time)

    Now, this managed hosting comes at extra expense - somewhere in the region of $500 per month - and the limit on the number of IP addresses they'll manage on the firewall is also a real pain. My thinking is it would be better and cheaper to:

    - Stay with the same host, since the dedicated box is fine
    - Get an Amazon AWS account and use their servers to manage backups; there are a number of good tools that can be used to automate the process
    - Configure iptables myself so that I have complete control of the firewall (see the sketch after this question)

    I want to know:

    - Is a managed virtual firewall likely to be more secure than me configuring iptables?
    - Is it best, in your opinion, to let someone else take care of backups?
    - From your experience, is there anything else I'm missing that warrants using managed hosting over a DIY service?

    I think there is some reluctance to give up managed hosting because a managed host in effect takes responsibility for your server, whereas any hardware or security issue with a server that we manage would mean we are forced to hold our hands up when a client site goes down. That said, I personally don't think a managed host does that much in the day-to-day running of your server (backups are automatic, OS updates are carried out with ease, etc.).
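    As a point of reference for the DIY firewall option, here is a minimal iptables sketch; the default-deny inbound policy and the choice of ports (SSH, HTTP, HTTPS) are assumptions to adapt, not a complete ruleset.

        # Sketch: drop inbound traffic by default, then allow the essentials.
        iptables -P INPUT DROP
        iptables -A INPUT -i lo -j ACCEPT
        iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
        iptables -A INPUT -p tcp --dport 22 -j ACCEPT    # SSH
        iptables -A INPUT -p tcp --dport 80 -j ACCEPT    # HTTP
        iptables -A INPUT -p tcp --dport 443 -j ACCEPT   # HTTPS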

    Read the article

  • Is my project a web site or a mobile app?

    - by Evik James
    I am a web developer. I have been developing web sites professionally for 15+ years using ColdFusion, SQL Server, and jQuery. I am doing a project for a client that uses the above technologies. The site has a mobile facet too; for that, I am using jQuery Mobile. The site enables retail store personnel to gather a store customer's preferences using any smart phone or tablet. The store personnel just need to access a special link via a QR code and a login. Anyone can easily access the site from a PC's browser, too. Some sources suggest that mobile apps must be downloaded and installed through a third party, such as Google, Amazon, or Apple. Other sources suggest that anything designed for use on a mobile device is a mobile app. Given a site that is specifically designed for use by mobile devices and that extensively uses jQuery Mobile, is this a "web site" or a "mobile app"? What is the proper description for this type of site? My customer insists that it is one and not the other. I insist that it is the other and not the one. Can you help me clarify this?

    Read the article

  • What to use for "localhost" that includes PHP/SQL functionality?

    - by Jack
    I really do not want to install Linux at the moment; I would have to borrow a USB key, move files, format it, flash it, format it again, move files back, and give it back. What would you recommend that is lightweight, easily and cleanly uninstallable afterwards (I will install Linux when I get a new DVD-ROM, which will be in ~2 weeks), and that also supports PHP and SQL? To be precise, I want to install a WordPress blog, a few plugins, etc., and develop a theme. If there is no such thing for Windows (7, x64 if that matters), let me know too, and I will borrow the USB key (even though it's a pain).

    Read the article

  • How would I rank my keywords in the Yahoo search engine?

    - by user1430715
    I am working as a search engine optimizer team lead at a company and am facing a problem with a project, http://www.Prooftech.com.sg. The website has 10 keywords for which my client wants top-10 rankings in the Yahoo Singapore search engine. In three months of work I have achieved top-10 rankings for seven of them - Waterproofing, RC Roof, Wall Leakages, Ceiling Leakages, Water Leakages, Roof Tile Coating, and Roof Tiles Repair - but I am still not getting listing positions for Roof, Concrete Repair, and Grouting. I have done a lot of bookmarking, blog commenting, blog creation, press releases, and classified ads to get these 3 keywords listed, but there is no change in the results. Can anyone help me out of this problem so I can get good rankings for Roof, Concrete Repair, and Grouting?

    Read the article

  • What percent of visitors should click on the next page before you enable prefetching?

    - by Kevin Burke
    Mozilla Firefox and Google Chrome support prefetching via an HTML tag: <!-- in chrome --> <link rel="prerender" href="http://example.org/index.html"> I suppose it is always worthwhile to include this tag if 100% of users on a page click on the "Next Page" button or similar, and never worthwhile to include it if only 2% or 3% of users visit the following page. At what percent of clicks should you turn on prefetching of the next page? 65%? Also, does the calculus change if the current page is HTTP and the next page is HTTPS?
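    There is no canonical threshold, but for illustration the hint can be emitted conditionally; in the sketch below, both the 65% threshold and the measured click rate are placeholder assumptions fed from your own analytics.

        <script>
          // Sketch: only add the prerender hint when click-through to the
          // next page is high enough to justify the wasted fetches.
          var nextPageClickRate = 0.72; // placeholder: from your analytics
          if (nextPageClickRate >= 0.65) {
            var hint = document.createElement('link');
            hint.rel = 'prerender';
            hint.href = 'http://example.org/index.html';
            document.head.appendChild(hint);
          }
        </script>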

    Read the article

  • Looking for recommendations for a server-side newsletter program

    - by Sparky672
    Hello,

    I'm currently using a server-side, SQL-based mailing list program called Php-List on multiple sites, and it works fairly well. But installation and setup are quite cumbersome and quirky, and the interface is not well organized... neither is the code, with pieces all over the place in random fashion. Customizing the look & feel and full site integration are both tedious and painful. Upgrading the version is made more complex since multiple edits need to be manually transferred each time. Also, probably due to a poor English translation, descriptions and instructions within certain areas of the user interface are contradictory and unclear. You just have to play with it and remember what you did last time it worked.

    It's supposed to let my customers send out their own newsletters... after supplying a written tutorial, about half of them seem to stumble through it okay and the other half just hire me to do it for them. So it is not quite easy enough for most average people to use. I'm looking for something that's as easy for them as using a blog or discussion forum. It also must be easier to set up and integrate into a site than Php-List. I have no problem getting dirty and writing CSS or HTML by hand, nor do I have any problem editing the program code.

    Perhaps what I'm looking for is a solution that is more organized, has a better GUI, and is template- or "skin"-based. Then, if I spend many hours customizing a skin, I can simply update the program and re-use my custom skin without having to reproduce the tedious setup over and over. (I currently maintain a list of about 25 things I must manually edit or add to multiple files in multiple directories each time I install or upgrade Php-List.)

    A great example of what I'm looking for is very much like WordPress or phpBB. They're both easy to install and customize, yet powerful and packed full of features. They're also VERY well organized, making customization less painful.

    So, enough yammering for now... does anyone know of something, besides Php-List, with many of the same features: a mailing list backed by a server-side database, custom sign-up pages, automatic opt-in/opt-out, custom HTML newsletter templates, etc.? Thank-you!

    Read the article

  • Recommend an open source CMS for single page web site

    - by RedMan
    Hi, I want to create a single-page web site like http://kiskolabs.com/ or http://www.carat.se to display my portfolio. I want to be able to add new products after launching the site without having to edit the entire site. I've looked at OpenCart (too much for a single-page site), Magento (more for ecommerce), and WordPress (I couldn't find open source / free templates to start from). Can you suggest a CMS that will support the creation of a single-page site and allow insertion of new products without having to edit the entire page? I would prefer a CMS that also has open source / free templates which I can tweak for my use. I can do PHP, MySQL, and XML. If it is easier, I can do PSD-to-site conversion (but I don't know much about this at all).

    Read the article

  • RewriteRule working local but not on remote server

    - by m0tv
    I have a .htaccess file with one simple RewriteRule:

        RewriteEngine on
        RewriteRule ^([A-Za-z0-9-]+)$ ?site=$1

    I want to take a URL like http://www.example.com/imprint and forward it to http://www.example.com/?site=imprint. I checked this rule with a RewriteRule tester, which gave me the results I want. On my local development system it works well too, but on the remote server the URLs just give me a 404 error. Other, simpler rewrite rules work with no problems, so everything must be set up correctly (I think...). The problem is that I don't have access to any error logs or the server configs, so the only thing I can do is guess... Can anyone tell me if there's something wrong with this rule, anything else I can do or test to solve this, or what could be wrong on the server?
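    Without access to the server configs this is guesswork, but two common culprits are mod_rewrite being disabled and .htaccess overrides being ignored. A sketch of what would need checking (the vhost lines are assumptions about the host's setup), plus guards so the rule leaves real files alone:

        # In the server config (not .htaccess), the host would need roughly:
        #   LoadModule rewrite_module modules/mod_rewrite.so
        #   <Directory /path/to/docroot> AllowOverride FileInfo </Directory>

        # In .htaccess: skip existing files and directories, then rewrite.
        RewriteEngine On
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^([A-Za-z0-9-]+)$ ?site=$1 [NC,L]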

    Read the article

  • Site overthrown by Turkish hackers...

    - by Jackson Gariety
    Go ahead, laugh. I forgot to remove the default admin/admin account on my blog. Somebody got in and has replaced my homepage with some internet graffiti. I've used .htaccess to replace the page with a 403 error, but no matter what I do, my WordPress homepage is still this hacker thing. How can I set up my server via .htaccess so that ONLY I can view it while I'm fixing this? What steps should I take to eradicate them from my server? If I delete the ENTIRE website and change all the passwords, is he completely gone? Thanks.
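    For the "only myself" part, a minimal .htaccess sketch in the Apache 2.2 style used elsewhere on this page; 203.0.113.7 is a documentation placeholder to replace with your own IP address.

        # Sketch: deny everyone except one admin IP while cleaning up.
        Order Deny,Allow
        Deny from all
        Allow from 203.0.113.7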

    Read the article

  • htaccess url rewrite

    - by user761396
    I used to have these rewrite rules:

        RewriteRule ^([^/]+)\.htm$ index.php?c=$1 [NC]
        RewriteRule ^([^/]+)\.htm/([0-9.]+)$ index.php?c=$1&amt=$2 [NC]

    Now I have to change them to:

        RewriteRule ^1/([^/]+)\.htm$ index.php?c=$1 [NC]
        RewriteRule ^1/([^/]+)\.htm/([0-9.]+)$ index.php?c=$1&amt=$2 [NC]
        RewriteRule ^2/([^/]+)\.htm$ index2.php?c=$1 [NC]
        RewriteRule ^2/([^/]+)\.htm/([0-9.]+)$ index2.php?c=$1&amt=$2 [NC]

    (The difference is the added subdirectory.) My question is how to redirect my old URLs to the 1/ subdirectory. Thank you.
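    One possible approach, assuming the old and new rules live in the same .htaccess file: 301 the old flat URLs into the 1/ subdirectory before the new rules run. A sketch:

        # Sketch: send old-style /page.htm (and /page.htm/12.50) URLs to /1/...
        # Place this above the new 1/ and 2/ rules so it matches first.
        RewriteRule ^([^/]+\.htm(/[0-9.]+)?)$ /1/$1 [R=301,L]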

    Read the article

  • Configure htaccess to show index.php as the default page instead of permissions error

    - by Jan De Laet
    Having a problem with my .htaccess. I have this to secure all my documents:

        Order Deny,Allow
        Deny from all
        Allow from 127.0.0.1
        <FilesMatch "\.(htm|html|css|js|php)$">
          Order Allow,Deny
          Allow from all
          Allow from 127.0.0.1
        </FilesMatch>

    Now everything works fine, except that the index page of www.mysite.com doesn't work and gives me the notification: "You don't have permission to access / on this server." If you go to www.example.com/index.php it works, but if you surf to www.example.com you get this message. How can I fix this?
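    One way around it, sketched on the assumption that the blanket deny mainly exists to protect non-web files: invert the logic so the request for / is never caught by the deny, and block only the sensitive extensions. The extension list is a placeholder.

        # Sketch: let DirectoryIndex serve index.php for "/", and deny only
        # file types that should stay private.
        DirectoryIndex index.php
        <FilesMatch "\.(ini|log|sql|bak|inc)$">
          Order Allow,Deny
          Deny from all
        </FilesMatch>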

    Read the article

  • How do I interpret direct traffic that lands on random pages?

    - by mfg
    Looking at yesterday, according to Google Analytics, I got six direct visitors to my site (their source/medium is direct/(none)). Only one landed on the actual domain; the other five ended up at miscellaneous foo.com/xyz.html pages. I did not send out links to people by email, and I'm not sure how likely it is that people would have copy/pasted the URLs. How do the visitors end up there? Is there a way to better capture where they might be coming from?

    Read the article

  • Which folders need to be backed up for migration in Joomla?

    - by Devdatta Tengshe
    I'm helping someone update and migrate their old website, built on the Joomla framework. Currently it is running on Joomla 1.5.8, which is an ancient version, and I've convinced them to upgrade to at least 2.5. I have already made a backup of the database. Most guides I have seen talk of backing up the entire public_html folder (the website runs on a shared host), but a fresh Joomla installation also puts several folders in public_html. So which of the folders in public_html hold the content of the website, and which belong to the old Joomla framework? I'm afraid that I might overwrite files of the new Joomla framework with the old framework if I copy all the files and folders into the new installation.

    Read the article

  • Github Feed affecting my WordPress installation? [on hold]

    - by saul
    Any idea how this fork is affecting my site? I went to check my website log stats and realized it may be the cause of a strange redirect constantly happening on my WordPress installation. Here's a line I found in my log:

        54.81.91.95 - - [07/May/2014:22:52:08 -0400] "GET /category/selfie/feed/ HTTP/1.1" 200 1826 "-" "feedzirra http://github.com/pauldix/feedzirra/tree/master"

    And this is the GitHub fork (or however these are called): https://github.com/feedjira/feedjira/tree/master. Basically, I think every time I update my categories (selfie in this case), I get redirected to install.php, probably by triggering some GET from that feed. To the best of my knowledge, this feed parser hits all URLs with this structure, blocking them, kind of like a DDoS attack?? Any ideas how to go about it?
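    If the immediate goal is just to keep that scraper away while investigating, one sketch (assuming Apache with mod_rewrite, and matching the user-agent string from the log line above):

        # Sketch: return 403 Forbidden to the feedzirra scraper.
        RewriteEngine On
        RewriteCond %{HTTP_USER_AGENT} feedzirra [NC]
        RewriteRule .* - [F,L]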

    Read the article

  • Problem with missing JSON functions on PHP 5.2.6 / Plesk 8.4

    - by Drachenviech
    I have a vserver running openSuse 10.3, Apache 2, and Plesk 8.4. I can update/upgrade neither: it is apparently not recommended to upgrade openSuse 10.3 (and an update to the EOL 10.4 does not seem to make much sense), and Plesk fails to update no matter what version I try (it even fails to upgrade to 8.4.1). Still, I can live with that somehow, primarily because I don't have the time to do a fresh remote install on the vserver.

    What really is a problem is that though the installed PHP is 5.2.6, it has no zip library and no JSON functions. The first is probably because PHP was not compiled with --enable-zip. The second is a big mystery, though. As I understand it, JSON support always comes with PHP unless it is compiled with the --disable-json configure option. That is not the case here, yet the json extension module is just not there. I even tried to enable it with extension=json.so, with no luck.

    The configure options of my PHP (as shipped with Plesk 8.4) are:

        '../configure' '--prefix=/usr' '--datadir=/usr/share/php5' '--mandir=/usr/share/man' '--bindir=/usr/bin' '--with-libdir=lib' '--includedir=/usr/include' '--sysconfdir=/etc/php5/apache2' '--with-config-file-path=/etc/php5/apache2' '--with-config-file-scan-dir=/etc/php5/conf.d' '--enable-libxml' '--enable-session' '--with-mm' '--with-pcre-regex=/usr' '--enable-xml' '--enable-simplexml' '--enable-spl' '--enable-filter' '--disable-debug' '--enable-inline-optimization' '--disable-rpath' '--disable-static' '--enable-shared' '--program-suffix=5' '--with-pic' '--with-gnu-ld' '--with-system-tzdata=/usr/share/zoneinfo' '--with-apxs2=/usr/sbin/apxs2' '--disable-all' '--disable-cli'

    As I understand it, PECL is not an option with 5.2.6. Or am I mistaken? Even if I am not, the openSuse repository only goes as far as PHP 5.2.4. The openSuse install even came without zypper, which I had to install manually. So is there a way to get the zip library and JSON running in PHP 5.2.6 without having to recompile the binary?
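    One userland workaround that avoids recompiling, assuming the pure-PHP PEAR class Services_JSON can be placed on the include path, is to shim the missing functions:

        <?php
        // Sketch: define json_encode()/json_decode() only when the json
        // extension is absent, delegating to Services_JSON.
        if (!function_exists('json_encode')) {
            require_once 'Services/JSON.php'; // assumes PEAR Services_JSON is installed

            function json_encode($value) {
                $json = new Services_JSON();
                return $json->encode($value);
            }

            function json_decode($value, $assoc = false) {
                $json = new Services_JSON($assoc ? SERVICES_JSON_LOOSE_TYPE : 0);
                return $json->decode($value);
            }
        }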

    Read the article

  • PHP-FPM stops responding and dies [migrated]

    - by user12361
    I'm running Drupal 6 with Nginx 1.5.1 and PHP-FPM (PHP 5.3.26) on a 1GB single core VPS with 3GB of swap space on SSD storage. I just switched from shared hosting to this unmanaged VPS because my site was getting too heavy, so I'm still learning the ropes. I have moderately high traffic: I don't really monitor it closely, but Google Adsense usually records close to 30K page views/day, and I usually have 50 to 80 authenticated users logged in and a few hundred more anonymous users hitting the Boost static HTML cache at any given moment.

    The problem I'm having is that PHP-FPM frequently stops responding, resulting in Nginx 502 or 504 errors. I swear I have read every page on the internet about this issue, which seems fairly common, and I've tried endless combinations of configurations, and I can't find a good solution. After restarting Nginx and PHP-FPM, the site runs really fast for a while, and then without warning it simply stops responding. I get a white screen while the browser waits on the server, and after about 30 seconds to a minute it throws an Nginx 502 or 504 error. Sometimes it runs well for 2 minutes, sometimes 5 minutes, sometimes 5 hours, but it always ends up hanging. When I find the server in this state, there is still plenty of free memory (500MB or more) and no major CPU usage, the control and worker PHP-FPM processes are still present, and the server is still pingable and usable via SSH. A reload of PHP-FPM via the init script revives it again. The hangups don't seem to correspond to the amount of traffic, because I observed this behavior consistently when I was testing this configuration on a development VPS with no traffic at all.

    I've been constantly tweaking the settings, but I can't definitively eliminate the problem. I set Nginx workers to just 1. In the PHP-FPM config I have tried all three of the process managers. "Dynamic" is definitely the least reliable, consistently hanging up after only a few minutes. "Static" has also been unreliable and unpredictable. The least buggy has been "ondemand", but even that is failing me, sometimes after as much as 12 to 24 hours. But I can't leave the server unattended, because PHP-FPM dies and never comes back on its own. I tried adjusting the pm.max_children value from as low as 3 to as high as 50, which doesn't make a lot of difference, but I currently have it at 10. Same thing for the spare servers values. I have also set pm.max_requests anywhere from 30 to unlimited, and it doesn't seem to make a difference.

    According to the logs, the PHP-FPM processes are not exiting with SIGSEGV or SIGBUS, but rather with SIGTERM. I get a lot of lines like:

        WARNING: [pool www] child 3739, script '/var/www/drupal6/index.php' (request: "GET /index.php") execution timed out (38.739494 sec), terminating

    and:

        WARNING: [pool www] child 3738 exited on signal 15 (SIGTERM) after 50.004380 seconds from start

    I actually found several articles that recommend doing a graceful reload of PHP-FPM via cron every few minutes or hours to circumvent this issue, so that's what I did: "/etc/init.d/php-fpm reload" every 5 minutes. So far, it's keeping the lights on, but it feels like a dreadful hack. Is PHP-FPM really that unreliable? Is there anything else I can do? Thanks a lot!
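    For what it's worth, here is a sketch of pool and global settings that are commonly tuned for this failure mode; every value is an assumption to experiment with, not a known fix.

        ; Pool section (e.g. the www pool config) - placeholder values
        pm = ondemand
        pm.max_children = 10
        pm.process_idle_timeout = 10s
        pm.max_requests = 200              ; recycle workers to contain slow leaks
        request_terminate_timeout = 60s    ; compare the ~50s SIGTERMs in the log

        ; Global section (php-fpm.conf): restart the master if children keep dying
        emergency_restart_threshold = 5
        emergency_restart_interval = 1m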

    Read the article

  • Do extra words in url affect SEO?

    - by smp7d
    Often, for technical reasons, we end up with some extra words in a URL that we are not trying to optimize for, as they have no bearing on the content. Examples would be:

    sportssite.com/content/sports-article
    movieportal.com/node/movie-review
    electronicsforum.com/blog/top-10-cameras
    webmasters.stackexchange.com/questions/34046/do-extra-words-in-url-affect-seo

    Do these have any effect on ranking in any of the major search engines? Would it behoove us to strip the extra words?

    Read the article

  • Should I have link rel=next & prev on URLs which have query variables?

    - by user21100
    For example, I have link rel prev & next set up on these product pages:

    site.com?page=2
    site.com?page=3

    (This is my preferred structure, by the way, and I'm trying to get all the ugly URLs which are littered with query variables deindexed, as they are causing duplicate content.) The above URLs are fine, but once a filter to narrow product results is selected, like "price", the URL looks like this:

    site.com?price[1000-1499]=on
    site.com?page=2&price[1000-1499]=on

    Right now the link rel prev & next tags are dynamically added to the header of these pages, but since I am working on getting these query-variable URLs deindexed, I am wondering if I should get rid of the tags on those pages? Any thoughts?

    Read the article

  • Adding your website to free web directories as a link building strategy

    - by Man
    It's been two months since I launched a website. I recently ran into some websites which list directories of other websites; examples include http://www.addgoogleurl.com/ and webdirectorieslist.com. I was talking with my colleague, and he says that adding the URL of my website to these kinds of directories will negate the effect of other organic, real links. Does Google count positive/negative points for these kinds of links from web directory websites? Do you have any source to refer to for your answer? I found a similar question asked before on webmasters.SE, but it asks about many links from a single website.

    Read the article

  • If a visitor's IP address contains "google" or a similar keyword, does this mean they were a crawler?

    - by Roscoe
    Hi, I have a huge list of IP addresses recorded from various visitors to a website. A huge share of the visitors, in some months over 70%, came from IP addresses whose reverse-DNS host names contained keywords such as google, yahoo, bot, crawler, etc. Does this mean that those users were in fact search engine crawlers? If so, why are there so many crawlers in my visitor records in comparison to genuine human visitors? (And if not, what's the explanation?) Thanks in advance.
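    Raw IP addresses are just numbers, so the keywords almost certainly come from the reverse-DNS host names the stats package records. A sketch of the reverse-plus-forward DNS check that Google documents for verifying Googlebot (the IP below is a placeholder):

        <?php
        // Sketch: resolve the visitor IP to a host name, then forward-confirm
        // that the host name resolves back to the same IP.
        $ip   = '66.249.66.1'; // placeholder visitor IP
        $host = gethostbyaddr($ip); // e.g. crawl-66-249-66-1.googlebot.com
        $isGooglebot = preg_match('/\.googlebot\.com$/i', $host)
                    && gethostbyname($host) === $ip;
        var_dump($host, $isGooglebot);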

    Read the article
