Search Results

Search found 9722 results on 389 pages for 'dvc pro'.

Page 174/389

  • Joomla: Cross site link boxes

    - by Dean Smith
    I am currently evaluating Joomla for use on the rebuild of our corporate website. One of the features our designs have is spaces for what we've been calling widgets. These include ones common to every page, such as contact-us boxes and section navigation, which all seem pretty easy to implement. The other widgets we would use to highlight specific pages within the site relevant to the one you are on, so they are different for each article. We've called these link widgets, and they are a pretty common sight on many websites. I'm honestly at a bit of a loss as to how to do this with Joomla. From a CMS perspective I'd like to allow users to simply select which three link widgets they want when editing the article, and to have some way of creating the widgets within the CMS as well. Editing a link widget would allow setting a title, a bit of text and the page that it links to. Am I going to be able to do this with Joomla, and if I can, roughly how do I go about it?

    Read the article

  • Is it safe to Block These URLs with Robots.txt?

    - by Edgar Quintero
    I have a website that has all URLs optimized and 301 redirected from nasty URLs to clean ones. However, the unclean URLs are still linked everywhere throughout the site, in menus, content, products, etc. Google currently has all the clean URLs indexed, along with a few unclean URLs too. So the old URLs are still linked all over the site (ideally this wouldn't be the case, but this is how it is at the moment). I would like to block the unclean URLs with robots.txt. The question: if I block these unclean URLs in robots.txt while the entire website still links to them (but they all redirect to the clean versions), will this affect the indexing status at all?
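    A minimal robots.txt sketch of that kind of block, assuming the unclean URLs share a recognizable pattern (the paths below are hypothetical examples, not taken from the question):

        User-agent: *
        # hypothetical patterns matching the old, unclean URLs
        Disallow: /index.php?id=
        Disallow: /*?cat=

    One well-known side effect worth weighing: once a URL is disallowed, crawlers stop fetching it, so they also stop seeing its 301 redirect.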

    Read the article

  • Experienced programmer, beginner at web design, tools for effective maintainable web design? [closed]

    - by Clinton
    I do quite a bit of programming in my work, which I'm comfortable with, but recently I've been trying to do some web design for non-work-related reasons. I've got a Drupal site up and running and added some content, but it all looks fairly basic: a header with some content. It doesn't look particularly polished. Anyway, as an example, what I wanted to do was make some "bubbles", each with some text in them. From a programmer's point of view, say: bubble(question_text, answer_text) might expand to a box with a border, containing "Question: " + question_text and then "Answer: " + answer_text. Of course I'd have lots of these bubbles, but I'd like to be able to change their look and feel in one place, so plain repeated HTML would be a maintenance nightmare. I also want to lay them out on the screen in some fashion. I was thinking of a mixture of JavaScript and CSS, or possibly PHP, which Drupal uses. On the other hand, I fear I might be taking a 1990s approach to this, and that there are actually tools available now that make this process a lot easier. I'm just wondering what the best approach to this sort of task is. Should I be using offline web design software and copying the code into Drupal, and if so, any recommendations? I'm sorry if my question is a bit vague, because I'm not really sure what question I should be asking. I'd appreciate it if you answer and comment, and I'll try my best to be more specific as I understand more.
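    A minimal sketch of the bubble(question_text, answer_text) idea in plain PHP; the function name, class names and model are hypothetical, not Drupal API (in Drupal this would more naturally live in a theme template or a small module):

        <?php
        // Hypothetical helper: render one Q&A "bubble" as HTML.
        // Every bubble shares the .qa-bubble class, so the look and feel
        // can be changed in one place in the stylesheet.
        function bubble($question_text, $answer_text) {
            $q = htmlspecialchars($question_text);
            $a = htmlspecialchars($answer_text);
            return "<div class=\"qa-bubble\">\n"
                 . "  <p class=\"qa-question\"><strong>Question:</strong> $q</p>\n"
                 . "  <p class=\"qa-answer\"><strong>Answer:</strong> $a</p>\n"
                 . "</div>\n";
        }

        // Usage: emit a couple of bubbles.
        echo bubble('Is this maintainable?', 'Yes - the markup lives in one function.');
        echo bubble('Where does the styling go?', 'In a single .qa-bubble CSS rule.');

    The matching styling (border, corners, spacing) would then be one .qa-bubble rule, so restyling every bubble is a single change.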

    Read the article

  • Is there any advantage/disadvantage to using robots.txt to disallow access to legal pages such as terms, privacy policy, etc.?

    - by CaptainCodeman
    As I understand it, having repetitive content is a detriment to search engine placement. Given that many websites use similar or even identical "Terms and Conditions" and "Privacy Policy" pages, due to similar legal wording or to copying and pasting from the same source, would it be a good idea to disallow access to these pages via robots.txt in order to avoid being penalized for "non-original content"? Or, on the contrary, could the search engines identify this as circumvention and penalize the site for trying to hide content? Or does it not matter?

    Read the article

  • How can I edit (in MS Expression Web) FrontPage Site Parameters (Substitutions)?

    - by Clay Nichols
    Or, asked another way: where are the values for MS FrontPage Substitutions (Site Parameters) stored, so that I can edit them in Expression Web? Background: I'm ashamed to admit that I've been maintaining our company's website in MS FrontPage for over 9 years. I'm moving it to Expression Web, which will display the Substitutions (stored as Site Parameters), but I can't figure out where to edit them. I tried searching the website's source folders (on my development PC) for the name of the parameter (s-Variable=hoursOfOperation) but did not find it (other than in the files it was actually used in).

    Read the article

  • Joomla ACL: two groups' permissions conflict

    - by semyon
    I have several user groups on my website, like:

        Site Staff
        Departments
        -- History department
        -- Physics department
        -- Foreign languages department
        -- IT department
        etc.

    I also have several categories, like:

        News
        About ...
        Departments
        -- History department
        -- Physics department
        -- Foreign languages department
        -- IT department
        etc.

    Users in the Site Staff group can edit the entire site, except for the Departments categories (I've set a Deny permission for them). Each Department user group can edit only its corresponding category. I have successfully implemented all this. The question is: if a user belongs to two groups (Site Staff and Physics department, for instance), he should be able to edit the whole site except for the Departments category, and he should also be able to edit the Physics department category. The latter is what I cannot implement. Can you suggest any ideas?

    Read the article

  • Rewrite rule to show as directory using .htaccess

    - by chanchal1987
    I want to implement a rewrite rule in my .htaccess file to show a specific URL as a directory on my server. See the code I've written below:

        RewriteRule ^(.*)/$ ?page=$1 [NC]

    This rewrites URLs like www.mysite.com/abc/ to www.mysite.com/index.php?page=abc. But if I request www.mysite.com/abc then it throws a 404 error. How can I write a rewrite rule which will match both www.mysite.com/abc and www.mysite.com/abc/? Edit: my current .htaccess file (after the 3rd revision of Litso's answer) looks like this:

        ##
        ErrorDocument 401 /index.php?error=401
        ErrorDocument 400 /index.php?error=400
        ErrorDocument 403 /index.php?error=403
        ErrorDocument 500 /index.php?error=500
        ErrorDocument 404 /index.php?error=404
        DirectoryIndex index.htm index.html index.php
        RewriteEngine on
        RewriteBase /
        Options +FollowSymlinks
        RewriteRule ^(.+)\.html?$ $1.php
        RewriteCond !-d
        RewriteRule ^(.*)/$ ?page=$1 [NC,L]
        RewriteCond %{REQUEST_URI} !index.php
        RewriteRule ^(.*)$ ?page=$1 [NC,L]
        ##
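    A minimal sketch of a rule that matches both forms, assuming requests for real files and directories should still be served as-is (an illustrative variant, not the accepted answer from the thread):

        RewriteEngine on
        RewriteBase /
        # Leave existing files and directories alone
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        # Accept /abc and /abc/ alike; the trailing slash is optional
        RewriteRule ^([^/]+)/?$ index.php?page=$1 [NC,L]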

    Read the article

  • Alexa indexing browsing history?

    - by Haluk
    We have this test.php sitting around in a forgotten folder. It is a script which just sends an email to our site admin. We never had a page linking to it. It is not indexed by Google. It does not exist in the Internet Archive Wayback Machine. But every now and then it gets crawled by ia_archiver. I wonder how it got indexed. Could it be because of the Alexa toolbar installed on our computer? Does Alexa index our personal browsing history?

    Read the article

  • Moving from a static site to a CMS with new URLs and meta-data for pages

    - by Chris J
    Hi, I am in the process of rebuilding a site from static pages to a CMS which will use mod_rewrite to generate new page URLs. In this process our marketing people and I have decided to tidy up the descriptions, keywords and titles. For example, a page whose URL is currently "website-name/about_us.html" and has a title of "website-name - something not quite page specific" will change to "website-name/about-us/" with the title "about us - website-name", and may have a few keywords and the description changed. Our goal with updating the metadata is to improve our page rankings and to keep in line with best practices for SEO. Though our current page rankings are quite good in many respects, there is room for improvement. All of the pages will also have content changes (like rearranged heading tags, a new menu on all pages, new content in the footer, and extra pieces of dynamic content relating to other pages). As part of the new site I plan to use 301 redirects from all the old URLs to the new URLs. My question is: what can I expect to happen to the page rankings in Google, in the short term and the long term? Will this be like kicking off a new site that has to build up trust over time, or will the original page rankings carry over?
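    For the about_us.html example mentioned above, a per-URL 301 in .htaccess (or the equivalent in the server config) is a minimal sketch of that plan; the paths are taken from the question and everything else is illustrative:

        # Old static page -> new CMS URL (one line per legacy page)
        Redirect 301 /about_us.html /about-us/

        # mod_rewrite equivalent, useful if other rewrite rules already exist
        RewriteEngine on
        RewriteRule ^about_us\.html$ /about-us/ [R=301,L]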

    Read the article

  • Development Pipeline / Phases

    - by Chris
    Hey all, I'm looking for a bit of advice. I have been developing websites for quite some time now, and I have come to the stage where I want to run things properly, so I am trying to put together a proper workflow for my projects. I have come up with the following phases and would love any feedback or additions I haven't thought of:

        Discovery and Research
        Information Architecture
        Interaction Design
        Visual Design
        Site Development
        Quality Assurance
        Launch, Wine and Cheese

    Cheers,

    Read the article

  • Google is not treating two Australian schools as separate sites when both are subdomains of qld.edu.au

    - by LuckySpoon
    My question relates to two websites, each of which is a "Calvary Christian College", but in two totally different locations and entirely unrelated to each other (except by name, and thus domain). All schools in the state are issued a <school-name>.qld.edu.au subdomain, in this case calvary.qld.edu.au and calvarycc.qld.edu.au. What's interesting is that these domains are crossing into each other's sitelinks for searches such as "calvary christian college townsville": in the search-results screenshot from the original post, the green data is for one school (the Townsville school, as per the search term) and the red data is for the other school. I put a sitelink demotion in for this 6 months ago (we control calvary.qld.edu.au), but we're seeing no change on the results page. I have been able to get the owners of calvarycc.qld.edu.au to submit demotions for our domain, which should go in sometime in the next few days. What can we do to tell Google that these websites are not interchangeable, despite both appearing as "subdomains" of qld.edu.au? We could possibly open channels of communication with the administrators of qld.edu.au, but we would need to tell them what to change, and at this point I'm out of ideas.

    Read the article

  • Redirect public traffic to a different subfolder, while local traffic remains unchanged

    - by ecnepsnai
    I would like to have local (intranet) HTTP traffic go to the /var/www/html folder while any public traffic goes to the subfolder /var/www/html/public. I've tried this configuration, with some variation, in httpd.conf:

        <VirtualHost PRIVATE-IP>
            DocumentRoot /var/www/html
            ServerName ecn
            ErrorLog /var/www/logs/error/private
            CustomLog /var/www/logs/access/private common
        </VirtualHost>

        <VirtualHost PUBLIC-IP>
            DocumentRoot /var/www/html/public
            ServerName PUBLIC-DOMAIN-NAME
            ErrorLog /var/www/logs/error/public
            CustomLog /var/www/logs/access/public common
        </VirtualHost>

    PUBLIC-IP, PRIVATE-IP, and PUBLIC-DOMAIN-NAME are all replaced with the correct values in the actual file. The problem is that local traffic can browse fine, but remote traffic is directed to the root folder and gets a 403 (because I have that folder blocked off through my .htaccess file). If I append /public to the URL it works fine.

    Read the article

  • Object-based content management system

    - by Adam Maras
    I remember hearing within the last year or two about a content management system either being released or developed that was centralized around product/item information. I'm aware that there are several CMSes that have this capability, but this particular one was built specifically for that task. Also, I remember it winning some sort of award or recognition for upcoming software products. However, I can't for the life of me remember what this CMS was called or who was developing it. Does anyone know what package I'm talking about?

    Read the article

  • Is using CSS3 a bad practice? [closed]

    - by Qmal
    Possible Duplicate: Should I use HTML5 and/or CSS3 to build my website? I just want to know whether it's considered "bad practice" to use things like rounded corners, gradients and so on. I understand that there are bots and crawlers that do not process CSS, but they don't need to, and nowadays most people use browsers that can process CSS3 with no problem. So should I make my buttons, shadows and such look pretty with CSS3 or with images?
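    For reference, the CSS3 features in question replace what used to require sliced images; a small illustrative rule (the class name and values are made up) looks like this:

        .fancy-button {
            /* Rounded corners, gradient and shadow without any images */
            border-radius: 6px;
            background: linear-gradient(to bottom, #5aa8e6, #2a6fb0);
            box-shadow: 0 2px 4px rgba(0, 0, 0, 0.3);
            color: #fff;
            padding: 8px 16px;
        }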

    Read the article

  • In Linux, which tools are free to use to make Web site mockups?

    - by user11173
    I am using Ubuntu/Fedora. Which available mock-up builders can I use before making a website? Follow-up: Adobe AIR for Linux is no longer supported ("To access older, unsupported versions, please read the AIR archive. Different operating system?"). Downloaded from http://www.balsamiq.com/download, the direct links for Mockups for Desktop are:

        Cross-platform: MockupsForDesktop.air
        Windows: MockupsForDesktop.exe
        Mac OS X: MockupsForDesktop.dmg
        Linux 32-bit: MockupsForDesktop32bit.deb
        Linux 64-bit: MockupsForDesktop64bit.deb
        Windows with Adobe AIR bundled: MockupsForDesktopInstallerWin.zip (for offline installations)

    Read the article

  • Why were there suddenly so many 400 requests in my access log?

    - by LotusH
    Below is a small part of my access_log:

        118.186.8.50 - - [19/Dec/2011:22:42:57 +0800] "-" 400 0 "-" "-"
        118.186.8.50 - - [19/Dec/2011:22:42:57 +0800] "-" 400 0 "-" "-"
        118.186.8.50 - - [19/Dec/2011:22:42:57 +0800] "-" 400 0 "-" "-"
        118.186.8.50 - - [19/Dec/2011:22:42:57 +0800] "-" 400 0 "-" "-"
        118.186.8.50 - - [19/Dec/2011:22:42:57 +0800] "-" 400 0 "-" "-"
        220.173.136.39 - - [19/Dec/2011:22:43:22 +0800] "-" 400 0 "-" "-"
        220.173.136.39 - - [19/Dec/2011:22:43:22 +0800] "-" 400 0 "-" "-"
        220.173.136.39 - - [19/Dec/2011:22:43:22 +0800] "-" 400 0 "-" "-"
        220.173.136.39 - - [19/Dec/2011:22:43:22 +0800] "-" 400 0 "-" "-"
        220.173.136.39 - - [19/Dec/2011:22:43:22 +0800] "-" 400 0 "-" "-"
        220.173.136.39 - - [19/Dec/2011:22:43:22 +0800] "-" 400 0 "-" "-"

    The volume was very large, something like one hundred thousand of these 400 requests per second. And I'm pretty sure there were no errors on my site in that period of time (no error reports, and I didn't change the source code).

    Read the article

  • Google Chrome: HTTP authentication issue in an iframe

    - by Daniel Dzussa
    I have an HTML file with 2 links (http://versionplus.in/pass/new.html); both links load their contents inside an iframe. I have two protected directories, one on the same server and the other on another server. If you click on either link it will pop up the login box, and this works the same for both links in all browsers except Google Chrome. Google Chrome doesn't show the login box for the protected folder on the other server. How can I fix this?

    Read the article

  • How to make Google recognize language for a multilingual website?

    - by Julien Fouilhé
    A few weeks ago I implemented translation functionality for my company's website. The website is now available in French and English, and I looked around for the best way to do this without losing any ranking and while keeping our pages indexed on Google. Here is what I did:

        I set a response header: Content-Language:en and Content-Language:fr
        My URLs are formatted as: http://www.website.com/en/... and http://www.website.com/fr/...
        My html tag is set with a lang attribute: <html lang="en"> and <html lang="fr">
        There is a <link rel="alternate" hreflang="en" href="EnglishPageUrl"> on French pages and a <link rel="alternate" hreflang="en" href="frenchPageUrl"> on English pages.

    But Google keeps returning some English pages when I search on the French engine, bearing in mind that the website was at first only available in English. Is that normal? Do I still have to wait? It has been almost one month now; I thought it would be okay by now. Thank you.
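    For comparison, the commonly documented hreflang pattern is for every page to list an alternate for each language version, with the hreflang value matching the language of the URL it points to (the URLs below are placeholders):

        <!-- On http://www.website.com/en/some-page (and mirrored on the /fr/ version) -->
        <link rel="alternate" hreflang="en" href="http://www.website.com/en/some-page">
        <link rel="alternate" hreflang="fr" href="http://www.website.com/fr/some-page">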

    Read the article

  • wget not respecting my robots.txt. Is there an interceptor?

    - by Jane Wilkie
    I have a website where I post CSV files as a free service. Recently I have noticed that wget and libwww have been scraping pretty hard, and I was wondering how to curb that, even if only a little. I have implemented a robots.txt policy, posted below:

        User-agent: wget
        Disallow: /

        User-agent: libwww
        Disallow: /

        User-agent: *
        Disallow: /

    Issuing a wget from my totally independent Ubuntu box against my server (e.g. wget http://myserver.com/file.csv) shows that the robots.txt just doesn't seem to stop it. Anyway, I don't mind people grabbing the info; I just want to implement some sort of flood control, like a wrapper or an interceptor. Does anyone have a thought about this, or could you point me in the direction of a resource? I realize it might not even be possible. Just after some ideas. Janie
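    Since robots.txt is purely advisory (a client is free to ignore it), one hedged sketch of a server-side control is to refuse requests by User-Agent in .htaccess; the agent strings below are illustrative, and this blocks outright rather than rate-limiting:

        RewriteEngine on
        # Refuse requests whose User-Agent claims to be wget or libwww (case-insensitive)
        RewriteCond %{HTTP_USER_AGENT} (wget|libwww) [NC]
        RewriteRule ^ - [F,L]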

    Read the article

  • Alternatives to Marin Software for ppc management? [closed]

    - by Skyao
    Does anyone have a suggestion for a PPC management tool similar to Marin Software but much cheaper? Marin Software Enterprise charges a minimum of several thousand dollars per month. The functionality needed is as follows:

        Keyword creation and management
        Campaign management
        Automated bidding and ROI tools
        Reporting and analytics
        Ability to upload/download customized revenue data

    Any suggestions would be appreciated. Thanks.

    Read the article

  • Woolrich Prezzi that are perfect for all women

    - by WoolrichParka
    The Woolrich Prezzi Woolrich Parka layers are made with 100% down and with a reasonable blend on the outer-shell area that holds the sites for the most innovative customers. The Woolrich Arctic layers are developed in the traditional shade, which can also be added to the Woolrich Parka hood to give women a reasonable, elegant look. Now a wide Woolrich Outlet Bologna range of Woolrich Parka Men designs is shown for your choice, all of which are worth buying in design and price. wufengfengmaple36

    Read the article

  • Trouble with .htaccess redirection

    - by mike23
    I use this redirect rule to redirect users from www.domain.com/admin to www.domain.com/wp-admin on a WordPress site:

        RedirectMatch 301 \@admin http://www.domain.com/wp-admin

    The problem is that instead of redirecting to wp-admin/, it redirects to an article called "Administrators are awesome people" (slug: administrators-are-awesome-people). I can guess what is going on: WP sees that there is an article slug starting with "admin" and redirects to it, overruling my own rule. Is there a way to be more specific, like saying "redirect URLs that end with exactly admin"?
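    As a hedged sketch of the "ends with exactly admin" idea: RedirectMatch takes a regular expression, so the pattern can be anchored (whether it wins over WordPress's own canonical-redirect logic depends on rule ordering, so this is illustrative rather than a guaranteed fix):

        # Only match a path whose last segment is exactly "admin", with or without a trailing slash
        RedirectMatch 301 ^/admin/?$ http://www.domain.com/wp-admin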

    Read the article

  • Web stalker has purchased a domain name that uses my personal name, web page is defamatory [closed]

    - by Deborah Morse-Kahn
    We have been unsuccessful in persuading a stalker's website host to release the domain name he purchased, which is my own personal name, e.g. PERSONALNAME.com. You will find my name below in the signature area; look for yourself. The one page that this domain name leads to contains dreadful and defamatory material. No attorney has felt it worth their time to chase this issue down, and we cannot afford to go to a national or international dispute resolution group to bring this issue to WHOIS. Worse, the stalker is amoral and a psychopath: he would just love the attention. We've even considered trying to find someone to illegally hack into the webpage to at least redirect the domain pointers to my own professional website. This issue has now continued for two years and is affecting my professional reputation as potential clients look for me online. Is there any remedy? Your help and advice would be greatly welcomed.

    Read the article

  • SMTP server to deliver mail to Rails app, how?

    - by Gunchars
    Hi all, this is my first question and I hope I chose the right place to post it. Here's what I need help with: I've been looking all day and I'm having a hard time finding an SMTP mail server that fits the following criteria:

        it is lightweight, does one thing and does it well
        it is able to route and deliver local mail to a Rails application

    The second point could be accomplished in any number of ways. I'm running a VPS, so I have full freedom in how to implement this. It could, for example, put messages straight into the db, pipe them to a helper program that would then process them accordingly, or save messages in an mbox file and run a script after every received message. I'm building a small site, so traffic is not going to be a problem. If there are alternative ways to deliver messages to a Rails app, I'd gladly hear about them. Thank you. EDIT: After a long search, I think I've found what I was looking for. Exim is a mail server that can deliver local mail to pipes. Also, Rails 3 and ActionMailer can make it really easy to process the incoming mail. More info here: http://www.exim.org/exim-html-current/doc/html/spec_html/ch29.html http://guides.rubyonrails.org/action_mailer_basics.html#receiving-emails
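    A rough sketch of the pipe-based route mentioned in the edit, with hypothetical names and paths (the mailer class, model and the pipe command are illustrative, not taken from the post): Exim's pipe transport hands the raw message to a command, which passes it to an ActionMailer receive method.

        # app/mailers/incoming_mailer.rb (hypothetical name)
        # Exim's pipe transport would run something like:
        #   /path/to/app/script/rails runner 'IncomingMailer.receive(STDIN.read)'
        # passing the raw message on standard input.
        class IncomingMailer < ActionMailer::Base
          def receive(email)
            # `email` is a parsed Mail object; Message is a hypothetical
            # ActiveRecord model used here just to show where the data goes.
            Message.create!(
              subject: email.subject,
              sender:  email.from.first,
              body:    email.body.decoded
            )
          end
        end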

    Read the article

  • Free online PHP hosting [closed]

    - by Anthony Newman
    Possible Duplicate: How to find web hosting that meets my requirements? I have a PHP script that can take $_GET parameters from a URL (e.g. http://www.example.com/test.php?name=george). I'd like to be able to host this script online so that others can pass parameters to it and obtain the returned data. Does anyone know of a free PHP hosting site that would allow for this functionality? (PS: I can't host it myself.) Thanks!

    Read the article
