Search Results

Search found 9722 results on 389 pages for 'dvc pro'.


  • Parse text file on click - and then display

    - by John R
    I am thinking of a methodology for rapid retrieval of code snippets. I imagine an HTML table laid out as a grid, with row and column headers one, two, ... ten, and each cell naming a function such as oneTwo(), oneTen(), twoOne(), twoTen(), tenOne(), tenTwo(). When a user clicks a function in this table, a snippet of code is shown in another div tag or perhaps a popup window (I'm open to different solutions). I want to maintain only one PHP file named utilities.php that contains a class called 'util'. This file and class hold all the functions referenced in the table above (the file is also used on various projects and is functional code). A key point is that I do not want to update the HTML documentation every time I write or update a function in utilities.php. I should be able to click a function in the table and have PHP open the utilities file, parse out the appropriate function and display it in an HTML window. Questions: 1) I will be coding this in PHP and JavaScript, but am wondering if similar scripts already exist (for all or part of this) so I don't reinvent the wheel. 2) Quick and easy Ajax suggestions are appreciated too (I will probably use jQuery, but am rusty). 3) What is a good methodology for parsing the functions out of utilities.php? (I'm not too good with regex.)
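    One possible approach, sketched below as a rough starting point rather than a finished solution: an Ajax endpoint opens utilities.php, finds the requested function by name, and returns its source by counting braces. The file and function names come from the question; the helper itself (extract_function_source) is hypothetical, and it assumes the function bodies contain no unbalanced braces inside strings or comments.

        <?php
        // Hypothetical helper: pull the source of a single function out of a PHP file.
        function extract_function_source($file, $name)
        {
            $code = file_get_contents($file);
            if ($code === false) {
                return null;
            }
            // Locate the start of "function <name>(".
            if (!preg_match('/function\s+' . preg_quote($name, '/') . '\s*\(/', $code, $m, PREG_OFFSET_CAPTURE)) {
                return null;
            }
            $start = $m[0][1];
            // Walk forward, counting braces until the function body closes.
            $depth = 0;
            $len   = strlen($code);
            for ($i = $start; $i < $len; $i++) {
                if ($code[$i] === '{') {
                    $depth++;
                } elseif ($code[$i] === '}') {
                    $depth--;
                    if ($depth === 0) {
                        return substr($code, $start, $i - $start + 1);
                    }
                }
            }
            return null;
        }

        // Example endpoint usage, called via Ajax as snippet.php?fn=oneTwo:
        $fn = isset($_GET['fn']) ? $_GET['fn'] : '';
        echo '<pre>' . htmlspecialchars((string) extract_function_source('utilities.php', $fn)) . '</pre>';

    On the browser side, a one-line jQuery call such as $('#snippet').load('snippet.php?fn=oneTwo'); would drop the returned markup into a div when a table cell is clicked.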

    Read the article

  • guest blogging, recipricle links & nofollow

    - by sam
    When I write a guest blog post for another site and, within that post, link back to myself, that counts as an inbound link. If I then write something on my own blog like "have a look at this post I wrote for _" and make that a link, the two links would be reciprocal, correct? Thus cancelling each other out. If I were to make the link back to my article a nofollow link, would I still get the link juice? And if I write a guest blog post and that site later wants to write a guest post on my site, what is the best way to handle it, since won't those links also cancel each other out and have no effect?

    Read the article

  • Free domains for site names?

    - by Prix
    I am going to create a forum for my gaming clan and was looking for a way to point/redirect from a shorter domain name to an ugly/long one. I am looking for a provider similar to http://www.freedomain.co.nr/, but with more options and different domain extensions, so I can decide which one would be easiest to use. It would be something like www.niceclanname.domainext pointing to www.uglyhardnametouseandremember.com/clan/. What services are available that do this?

    Read the article

  • Google Analytics HTTP vs HTTPS

    - by Pelangi
    I want to use Google Analytics on a website that uses both HTTP and HTTPS, set up as follows: secure pages accessed through https://mydomain.com/secure/* are always on HTTPS, and any access to these pages over HTTP is redirected to HTTPS; any other pages are accessible over both HTTP and HTTPS. I have a Google Analytics profile whose URL uses HTTPS. Will it cover all traffic? Do I need to create another profile using HTTP, and if so, how should I apply the other profile?
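    For reference, a minimal sketch of the standard Google Analytics (analytics.js) tag; the property ID UA-XXXXXXX-Y is a placeholder. The loader is requested over a protocol-relative URL, so the same snippet works on pages served over either HTTP or HTTPS without modification; the protocol in the profile's default URL mainly affects how report links are displayed, not which hits are collected.

        <script>
        (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
        (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
        m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
        })(window,document,'script','//www.google-analytics.com/analytics.js','ga');

        ga('create', 'UA-XXXXXXX-Y', 'auto');  // placeholder property ID
        ga('send', 'pageview');
        </script>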

    Read the article

  • Storing User-uploaded Images

    - by Nyxynyx
    What is the usual practice for handling user-uploaded photos and storing them in the database and on the server? For a user profile image: after receiving the image file from the user, rename the file to <image_id>_<username>, move it to /images/userprofile, and add the image filename to a users table containing profile details like first_name, last_name, age, gender, birthday. For an image attached to a review written by a user: after receiving the image file from the user, rename the file to <image_id>_<review_id>, move it to /images/reviews, and add the image filename to a reviews table containing the review details like review_id, review_content, user_id, score. Question 1: How should I store the image filenames if the user can upload multiple photos for a particular review? Serialize? Question 2: Or should I have another table review_images with columns review_id, image_id, image_filename just for tracking images? Will doing a JOIN when retrieving the image_filename from this table slow down performance noticeably? Question 3: Should all the images be stored in a single folder? Will there be a problem when we have 100K photos in the same folder? Is there a more efficient way to go about doing this?
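    As a rough sketch of the second option, assuming MySQL and the column names given in the question (the types are placeholders): a separate review_images table keeps one row per photo, and an index on review_id makes the JOIN a cheap lookup rather than a noticeable cost.

        CREATE TABLE review_images (
            image_id        INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
            review_id       INT UNSIGNED NOT NULL,
            image_filename  VARCHAR(255) NOT NULL,
            INDEX idx_review (review_id)   -- lets the JOIN below use an index lookup
        );

        -- Fetch a review together with all of its image filenames.
        SELECT r.review_id, r.review_content, ri.image_filename
        FROM reviews r
        LEFT JOIN review_images ri ON ri.review_id = r.review_id
        WHERE r.review_id = 123;

    Compared with serializing filenames into a single column, this keeps each photo addressable on its own and avoids rewriting the whole list whenever one image is added or removed.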

    Read the article

  • Can I improve my AdWords quality scores with better landing pages?

    - by Eric
    I noticed that I have some keywords in my AdWords that are totally applicable to my site but the quality score of the keyword is 4 or 5. I'd like to get it up higher by creating custom versions of my site's home page (landing page) targeted specifically for people searching on those keywords. So for example, if we pretend my site sells pet food, my current home page has the phrase "dog food." I have a specific AdWords campaign for people searching on cat food (with cat food-specific ads). I'm thinking about changing the URL on those ads to something like http://mysite.com/cat.html, so a different home page comes up with the phrase "cat food." My thinking is that will help Google see that this new landing page is appropriate for the keywords and will raise my quality score for the "cat food" keywords. (Note that none of what I'm doing is shady or misleading; nobody would disagree that all of the keywords and ads I've created are perfect and appropriate for what my site offers.) Question: is what I describe the correct way to raise poor quality scores on keywords, and will it help?

    Read the article

  • Bookmark login_email at new PayPal URL [closed]

    - by Jonna Stevens
    I have used this bookmark in Firefox so that my email would be autofilled and I only had to type in my password. PayPal has recently changed its login URL. Has anybody figured out a method to achieve this with the new URL? Old URL: https://www.paypal.com/es/cgi-bin/webscr?cmd=_login-run&login_email=myemail%40myemail.com New URL (not working): https://www.paypal.com/es/webapps/mpp/home-merchant?login_email=myemail%40myemail.com

    Read the article

  • Multi user issue in Drupal 7

    - by sachin
    I am trying to create a website in which there is a different link for each user. For example, if my site address is example.com and there are three users, u1, u2 and u3: when one of them logs in, the site should redirect to example.com/u1 (their own path), and if that user creates a link or block at that URL, it should not be visible to the other two users. Each of them should also have a separate admin panel.

    Read the article

  • Hosting media on separate server than web server

    - by user18832
    Basically I have a website hosted by a web hosting company which I have limited access to (ftp upload etc). I have a home server which I use to record and store audio files. Is there an elegant way or best practice to host a page on the webserver which links to the audio files? I'm considering hosting a page on the home server and redirecting to that from the web server, or setting up something like rsync to push the audio files to the web server - I'm just not certain which solution would be best.
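    If pushing the files to the web host turns out to be the cleaner option, and the host allows SSH access (the question only mentions FTP, so this is an assumption), a scheduled one-liner from the home server could keep the audio directory mirrored; the paths and hostname below are placeholders.

        # Mirror new and changed audio files from the home server to the web host.
        rsync -avz --delete /srv/audio/ user@example-webhost.com:public_html/audio/

    If only FTP is available, a tool such as lftp can do a similar mirror, but rsync's delta transfer makes it the lighter choice whenever SSH is an option.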

    Read the article

  • What is a good network for full-page rich ads?

    - by Vishnu
    I'm currently developing a website where users will be able to upload content. I would like to show a full-page ad whenever someone tries to view the content. The ad should take up most of the screen, and I should be able to have a "continue to the content" link at the top. Preferably, I want something like what is currently on Forbes (if you haven't seen it, look here: http://www.forbes.com/fdc/welcome.shtml but with an ad in the black area). Of course, the more revenue the better. Thanks.

    Read the article

  • Google won't display site

    - by Markasoftware
    My website (markasoftware.getenjoyment.net) doesn't seem to be indexed properly by Google (I haven't tried other search engines). When I type in the URL of my site it appears right at the top of the results, as it should. When I type in the entire contents of the title, however, the site doesn't appear! The title is quite long (Thermonuclear War Game Online: Thermonuclear War By Mark) and it has little, if any, competition. Have I been penalised by Google for some reason, or is it something else? I have received zero hits from search engines. Can someone tell me why my site doesn't appear?

    Read the article

  • Allowed keywords for adwords [closed]

    - by Tom Gullen
    Possible Duplicate: Ok to target product names in adwords? I've swapped the relevant names for made-up ones without losing the meaning, which is a little difficult. A competitor is called "Box Maker". Can I target the keywords "box maker" if my company is selling a tool to help you make boxes? Or is that disallowed? Would "Box Maker" the company be able to file a complaint with Google, and would it go anywhere? The term 'box maker' gets a lot of searches and is an incredibly cost-effective search to target.

    Read the article

  • SSL certificates - best bang for your buck [closed]

    - by Dunnie
    I am in the process of setting up an online store. This is the first project I have attempted which will require a good level of security, so I recognise that a decent SSL certificate is a must. My current (albeit admittedly basic) understanding of the options is: DV SSL - more or less pointless, as it provides no real verification. OV SSL - better, as it provides a basic level of organisational verification. EV SSL - 'better, full' verification, but much more expensive. As this is a new business venture and budgets are tight, which option provides the best bang for my buck? I have read questions such as EV SSL Certificates - does anyone care? which suggest that EV certificates are a bit of a con. Does this mean that OV certificates offer better value for money, especially for new businesses with shallow pockets?

    Read the article

  • http requests, using sprites and file sizes -

    - by crazy sarah
    Hi all, I'm in the process of finding out all about sprites and how they can speed up your pages. I've used SpriteMe to create an overall sprite image which is 130kb; this is made up of 14 images with a combined total size of about 65kb. So is it better to have one HTTP request and a file size of 130kb, or 14 requests for a total of 65kb? Also, there is a detailed image which has been put into the sprite and caused its size to go up by about 60kb; this used to be a separate JPG image which was only 30kb. Would I be better off keeping it separate and suffering the additional request?
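    For context, this is how the combined image is normally used once it has been downloaded: one request fetches the sprite, and each icon simply selects its own region with a background offset. The class names, file name and pixel offsets below are made-up placeholders, not taken from the question.

        /* One shared sprite image; each icon picks out its own region by offset. */
        .icon          { background: url(sprite.png) no-repeat; width: 16px; height: 16px; }
        .icon-home     { background-position: 0 0; }
        .icon-search   { background-position: -16px 0; }
        .icon-settings { background-position: -32px 0; }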

    Read the article

  • Sitemap.xml generator

    - by miller55
    I need a sitemap generator that supports an unlimited number of pages. Also, is there a generator that lets me exclude certain pages from the sitemap? For example, I have pages like review pages which are generated automatically when someone adds a review. They all look like mysite.com/review_1.html, mysite.com/review_2.html and so on, and I don't want the sitemap generator to include those pages. Thanks in advance!

    Read the article

  • Go up one directory in mod_rewrite

    - by Rudolph Gottesheim
    I've got a standard Zend Framework 1 project that looks a bit like this:

        Project
        |- public
           |- .htaccess
           |- index.php

    The .htaccess looks like this:

        RewriteEngine On
        RewriteBase /
        RewriteRule ^image/.*$ img.php?file=$1 [NC,L]
        RewriteCond %{REQUEST_FILENAME} -s [OR]
        RewriteCond %{REQUEST_FILENAME} -l [OR]
        RewriteCond %{REQUEST_FILENAME} -d
        RewriteRule ^.*$ - [NC,L]
        RewriteRule ^.*$ index.php [NC,L]

    Now I want to start transitioning the site to Zend Framework 2, which I put in a separate directory in the root, so the whole thing looks like this:

        Project
        |- public
           |- .htaccess
           |- index.php
        |- zf2
           |- public
              |- .htaccess
              |- index.php

    What would I have to change in my original (ZF1) .htaccess to route all requests to (for example) /zf2/whatever to ZF2's index.php? I've tried

        RewriteRule ^zf2(/.*)$ ../zf2/public/index.php [NC,L]

    in the line after RewriteBase /, but that just gives me a 400 Bad Request.
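    A hedged sketch of one workaround, not a drop-in fix: in a per-directory .htaccess the rewrite target has to stay inside the document root, so going up with ../ cannot work, which matches the 400 Bad Request. If the host permits symlinks, one option is to expose ZF2's public directory inside the ZF1 docroot and rewrite to it, placing the new rule just before the final ZF1 catch-all:

        # One-time, from a shell inside Project/public (assumes symlinks are allowed
        # and Options FollowSymLinks is in effect):
        #   ln -s ../zf2/public zf2

        # In Project/public/.htaccess, after the file/dir pass-through block and
        # before "RewriteRule ^.*$ index.php [NC,L]":
        RewriteRule ^zf2(/.*)?$ zf2/index.php [NC,L]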

    Read the article

  • Using Google Webmaster & Analytics, what data to look at to improve website performance?

    - by Rob
    Using data from Google Analytics and Webmaster Tools, what should I be looking at to improve my website's performance? I want to improve the SEO, usability and just general performance of the site. EDIT: It's a portfolio website that we've done the initial SEO for; we have also optimised all the images and made the site as fast as possible. What kind of things should I be looking out for in the Analytics and Webmaster data to improve performance, both for SEO and for each individual page?

    Read the article

  • How to tell google a blog article has been updated?

    - by Scott
    The URL of my posts contains the publication date and slugged title, but how can I best show Google search users that an article has been updated since its original publication? I was considering devoting a few characters of the meta description (e.g. "updated 2013-Aug-1"), or doing so under the first h1 tag. I don't want to hurt the SEO value of my site, but I also want to let users know that articles have been substantially updated since their publication. Is there a better way to do this?
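    One lightweight complement to the on-page options above, offered as a sketch rather than a guarantee of how Google will present it: declare the modification date in the XML sitemap entry for the post with <lastmod>, so crawlers are told about the update without spending meta-description or heading space on it. The URL and date below are placeholders, and the fragment sits inside the sitemap's <urlset>.

        <url>
          <loc>http://example.com/2012/05/my-article-title/</loc>
          <lastmod>2013-08-01</lastmod>
        </url>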

    Read the article

  • Is it safe to Block These URLs with Robots.txt?

    - by Edgar Quintero
    I have a website where all URLs have been optimized and 301 redirected from nasty URLs to clean ones. However, the unclean URLs are still linked everywhere throughout the site - in menus, content, products, etc. Google currently has all the clean URLs indexed, along with a few unclean URLs too. So the old URLs are still linked all over the site (ideally this wouldn't be the case, but this is how it is at the moment). I would like to block the unclean URLs with robots.txt. The question: if I block these unclean URLs in robots.txt while the entire website still links to them (but they all redirect to the clean version), will this affect the indexing status at all?

    Read the article

  • Trouble with .htacess redirection

    - by mike23
    I use this redirect rule to redirect users from www.domain.com/admin to www.domain.com/wp-admin on a WordPress site:

        RedirectMatch 301 \@admin http://www.domain.com/wp-admin

    The problem is that instead of redirecting to wp-admin/, it redirects to an article called "Administrators are awesome people" (slug: administrators-are-awesome-people). I can guess what is going on: WordPress sees that there is an article slug starting with "admin" and redirects to it, overruling my own rule. Is there a way to be more specific, like saying "redirect URLs that end with exactly admin"?
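    A hedged sketch of the usual way to make the pattern exact: RedirectMatch takes a regular expression matched against the URL path, so anchoring it stops it from also matching longer slugs that merely contain "admin". The domain is the one from the question.

        # Matches /admin (with or without a trailing slash) and nothing else,
        # so slugs like /administrators-are-awesome-people are left alone.
        RedirectMatch 301 ^/admin/?$ http://www.domain.com/wp-admin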

    Read the article

  • wget not respecting my robots.txt. Is there an interceptor?

    - by Jane Wilkie
    I have a website where I post CSV files as a free service. Recently I have noticed that wget and libwww have been scraping it pretty hard, and I was wondering how to circumvent that, even if only a little. I have implemented a robots.txt policy; I have posted it below:

        User-agent: wget
        Disallow: /

        User-agent: libwww
        Disallow: /

        User-agent: *
        Disallow: /

    Issuing a wget from my totally independent Ubuntu box shows that the robots.txt just doesn't seem to have any effect against my server - wget still happily fetches http://myserver.com/file.csv. Anyway, I don't mind people just grabbing the info; I just want to implement some sort of flood control, like a wrapper or an interceptor. Does anyone have a thought about this, or could you point me in the direction of a resource? I realize that it might not even be possible. Just after some ideas. Janie
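    Since robots.txt is purely advisory (wget consults it only for recursive downloads, and even that can be switched off by the user), server-side enforcement is the more reliable route. A minimal sketch, assuming Apache with mod_rewrite available in .htaccess; the user-agent substrings are the obvious guesses rather than an exhaustive list, and real rate limiting would need something like mod_ratelimit or a firewall rule on top of this.

        RewriteEngine On
        # Refuse requests whose User-Agent identifies itself as wget or libwww.
        RewriteCond %{HTTP_USER_AGENT} (wget|libwww) [NC]
        RewriteRule ^ - [F,L]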

    Read the article

  • Yahoo media player not working with Ruby on rails

    - by luca590
    I have a Yahoo Media Player embedded in my webpage. I am currently using Ruby on Rails to create/edit the page. When I click the play button next to a track, the player waits a while and then moves on to the next track without playing the first one. I then get a warning on my second (and last) track that its file could not be found. Does anyone have a better recommendation for an audio player, or a way to fix this one?

    Read the article

  • Poor backlink profile - search rankings not updated for 2+ months

    - by fistameeny
    I am carrying out some work on a website that is a PR2 with a few good quality, relevant backlinks (PR4-6). It has a presence on Twitter that is updated regularly, a Google Places listing, and listings in some decent directories (Qype etc). The site was rebuilt in Drupal 7 two months ago, with all the basics done - URL rewriting, an XML sitemap submitted to Google, and most importantly, good quality, structured content. I've noticed that Google is still showing "old" URLs from the previous version of the site that was ditched 8 weeks ago. I think the site may be penalised under the Penguin update, as a previous SEO company created many low quality links from link farms/directories. My question is: what is the correct way to deal with this? Bing Webmaster Tools can "disavow" links, and I guess I can attempt to contact the link farms to have the links removed. I've already submitted a request to Google asking that the penalty be removed, as we're trying to tidy up a bad history. We submit updated sitemaps to Google and Bing daily, and have built some further decent quality, relevant links. Is there anything further I can do?

    Read the article

  • How can I redirect everything but the index as 410?

    - by Mikko Saari
    Our site shut down and we need to serve a 410 to its users. We have a small one-page replacement site set up on the same domain and a custom 410 error page. We'd like all page views to be answered with a 410 and sent to the error page, except for the front page, which should point to the new index.html. Here's what is in the .htaccess:

        RewriteEngine on
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule !^index\.html$ index.html [L,R=410]

    This works, except for one thing: if I type just the domain name, I get the 410 page. With www.example.com/index.html I see the index page as I should, but plain www.example.com gets a 410. How could I fix this?
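    A possible fix, sketched on the assumption that the rest of the file stays as it is: a request for the bare domain reaches the per-directory rules with an empty path, which is not "index.html", so the negated pattern matches and the 410 fires. Explicitly rewriting the empty path to index.html first lets the front page through.

        RewriteEngine on
        # The bare domain arrives here as an empty per-directory path; send it to index.html.
        RewriteRule ^$ index.html [L]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule !^index\.html$ index.html [L,R=410]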

    Read the article

  • Which mobile device is appropriate as a utility tool for a web master?

    - by Kayle
    Basically, I'm looking for a device to use on the road and I would prefer to not have to sit down for the majority of the tasks (which rules out netbooks, in my mind). I'm also hoping to spend less than $500. This is what I'd like to "capably" be able to do on the device: Browse the web in non-mobile format, flash is a plus Email, chat, etc Have access to a decent text editor and ftp OR a browser that supports BESPIN/ACE Some sort of SSH support I'm looking at rooted Android phones and iPhone/iPads... though the phone aspect is only icing (it would be cool to consolidate the two devices and have net access through cell networks, but I'm not married to the idea). Are there cheap linux tablets that are ready for prime-time yet? I suppose that would be ideal. All suggestions welcome!

    Read the article
