Search Results

  • Experienced programmer, beginner at web design, tools for effective maintainable web design? [closed]

    - by Clinton
    I do quite a bit of programming in my work, which I'm comfortable with, but recently I've been trying to do some web design for non-work-related reasons. I've got a Drupal site up and running, and added some content. But it all looks fairly basic: a header with some content. It doesn't look particularly polished. Anyway, as an example, what I wanted to do was make some "bubbles", each with some text in them. From a programmer's point of view, say: bubble(question_text, answer_text) might expand to a box with some border, with "Question: " + question_text then "Answer: " + answer_text. Of course I'd have lots of these bubbles, but I'd like to change their look and feel in one place, so simple HTML would be a maintenance nightmare. I also want to lay them out on the screen in some fashion. I was thinking a mixture of JavaScript and CSS, or possibly PHP, which Drupal uses. On the other hand, I fear I might be taking a 1990s approach to this, and that there are actually tools available now that make this process a lot easier. I'm just wondering what the best approach to this sort of task is. Should I be using offline web design software and copying the code to Drupal, and if so, any recommendations? I'm sorry if my question is a bit vague, because I'm not really sure what question I should be asking. I'd appreciate it if you answer and comment, and I'll try my best to be more specific as I understand more.
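
    As a rough illustration of the single-point-of-change idea (not Drupal-specific; the helper name and the CSS class are made up for this sketch), a small PHP helper plus one CSS rule keeps every bubble's markup and styling in one place. In Drupal this would typically live in a theme template or a small custom module rather than a standalone file.

        <?php
        // One helper owns the markup, so every bubble changes together.
        function bubble($question_text, $answer_text) {
            return '<div class="bubble">'
                 . '<p><strong>Question:</strong> ' . htmlspecialchars($question_text) . '</p>'
                 . '<p><strong>Answer:</strong> ' . htmlspecialchars($answer_text) . '</p>'
                 . '</div>';
        }

        /* One CSS rule owns the look and feel. */
        .bubble { border: 1px solid #ccc; border-radius: 8px; padding: 1em; margin: 1em 0; }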

    Read the article

  • SMTP server to deliver mail to Rails app, how?

    - by Gunchars
    All, this is my first question and I hope I chose the right place to post it. Here's what I need help with: I've been looking for this all day and I'm having a hard time finding an SMTP mail server that would fit the following criteria: lightweight, does one thing and does it well; is able to route and deliver local mail to a Rails application. The second point could be accomplished in any number of ways. I'm running a VPS, so I have full freedom in how to implement this. It could, for example, put messages straight into the db, pipe them to a helper program that would then process them accordingly, or save messages in an mbox file and run a script after every received message. I'm building a small site so the traffic is not going to be a problem. If there are alternative ways to deliver messages to a Rails app, I'd gladly hear about them. Thank you. EDIT: After long searching, I think I've found what I was looking for. Exim is a mail server that can deliver local mail to pipes. Also, Rails 3 and ActionMailer can make it really easy to process the incoming mail. More info here: http://www.exim.org/exim-html-current/doc/html/spec_html/ch29.html http://guides.rubyonrails.org/action_mailer_basics.html#receiving-emails
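
    Building on that edit, a minimal sketch of the receiving side (the mailer and model names are made up; the piping command follows the Action Mailer guide linked above):

        # app/mailers/incoming_mailer.rb
        class IncomingMailer < ActionMailer::Base
          def receive(email)
            # Store whatever parts of the message the app cares about.
            Message.create!(subject: email.subject, body: email.body.decoded)
          end
        end

        # Exim (or any MTA that can deliver to a pipe) hands the raw message to Rails with:
        #   rails runner 'IncomingMailer.receive(STDIN.read)'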

    Read the article

  • Using CSS3 is a bad practice? [closed]

    - by Qmal
    Possible Duplicate: Should I use HTML5 and/or CSS3 to build my website? I just want to know if it's considered a "bad practice" to use things like rounded corners, gradients and so on. I understand that there are bots and crawlers that do not process CSS, but they don't need to. And nowadays most people use browsers that can process CSS3 with no problem. So should I make my buttons and shadows and such look pretty with CSS3, or with images?
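
    For what it's worth, the CSS3 route degrades gracefully, since browsers that don't understand a property simply ignore it. A minimal sketch (the class name and colours are made up):

        .button {
            /* flat fallback colour for older browsers */
            background: #4a90d9;
            /* gradient, rounded corners and shadow where supported */
            background: linear-gradient(#6ab0f3, #4a90d9);
            border-radius: 6px;
            box-shadow: 0 1px 3px rgba(0, 0, 0, 0.3);
        }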

    Read the article

  • In Linux, which tools are free to use to make Web site mockups?

    - by user11173
    I am using Ubuntu/Fedora. Which mock-up builders can I use before making a website? Follow-up: I downloaded Balsamiq Mockups from http://www.balsamiq.com/download, but the download page says Adobe AIR for Linux is no longer supported ("To access older, unsupported versions, please read the AIR archive. Different operating system?"). The direct links for Mockups for Desktop are: cross-platform MockupsForDesktop.air, Windows MockupsForDesktop.exe, Mac OS X MockupsForDesktop.dmg, Linux 32-bit MockupsForDesktop32bit.deb, Linux 64-bit MockupsForDesktop64bit.deb, and Windows with Adobe AIR bundled MockupsForDesktopInstallerWin.zip (for offline installations).

    Read the article

  • Web stalker has purchased a domain name that uses my personal name, web page is defamatory [closed]

    - by Deborah Morse-Kahn
    We have been unsuccessful in persuading a stalker's website host to release the domain name he purchased, which is my own personal name, e.g. PERSONALNAME.com. You will find my name below in the signature area. Look for yourself. The one page that this domain name leads to contains dreadful and defamatory material. No attorney has felt it worth their time to chase this issue down, and we cannot afford to go to a national or international dispute-resolution group to bring this issue to WHOIS. Worse, the stalker is amoral and a psychopath: he would just love the attention. We've even considered trying to find someone to illegally hack into the webpage to at least redirect the domain pointers to my own professional website. This issue has continued now for two years and is affecting my professional reputation, as potential clients have looked for me online. Is there any remedy? Your help and advice would be greatly welcomed.

    Read the article

  • Joomla: Cross site link boxes

    - by Dean Smith
    I am currently evaluating Joomla for use in the rebuild of our corporate website. One of the features our designs have is space for what we've been calling widgets. These widgets include common ones for each page (contact-us boxes, section navigation) that all seem pretty easy to implement. The other widgets we would use to highlight specific pages within the site relevant to the one you are on, and they are therefore different for each article. We've called these link widgets, and they are a pretty common sight on many websites. I'm honestly at a bit of a loss as to how to do this with Joomla. From a CMS perspective I'd like to allow users to just select which three link widgets they want when editing the article, and have some way of creating the widgets within the CMS as well. Editing a link widget would allow setting a title, a bit of text and selecting the page that it links to. Am I going to be able to do this with Joomla, and if I can, roughly how do I go about it?

    Read the article

  • Trouble with .htaccess redirection

    - by mike23
    I use this redirect rule to redirect users from www.domain.com/admin to www.domain.com/wp-admin on a WordPress site:

        RedirectMatch 301 \@admin http://www.domain.com/wp-admin

    The problem is that instead of redirecting to wp-admin/, it redirects to an article called "Administrators are awesome people" (slug: administrators-are-awesome-people). I can guess what is going on: WP sees that there is an article slug starting with "admin" and redirects to it, overruling my own rule. Is there a way to be more specific, like saying "redirect URLs that end with exactly admin"?
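
    One way to make the match exact, assuming the shortcut really is the bare /admin path, is to anchor the pattern so longer slugs no longer match:

        # Only /admin (with or without a trailing slash) is redirected;
        # /administrators-are-awesome-people is left alone.
        RedirectMatch 301 ^/admin/?$ http://www.domain.com/wp-admin/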

    Read the article

  • wget not respecting my robots.txt. Is there an interceptor?

    - by Jane Wilkie
    I have a website where I post CSV files as a free service. Recently I have noticed that wget and libwww have been scraping pretty hard, and I was wondering how to circumvent that, even if only a little. I have implemented a robots.txt policy. I posted it below:

        User-agent: wget
        Disallow: /

        User-agent: libwww
        Disallow: /

        User-agent: *
        Disallow: /

    Issuing a wget from my totally independent Ubuntu box shows that wget against my server just doesn't seem to be stopped, like so: http://myserver.com/file.csv Anyway, I don't mind people just grabbing the info, I just want to implement some sort of flood control, like a wrapper or an interceptor. Does anyone have a thought about this, or could you point me in the direction of a resource? I realize that it might not even be possible. Just after some ideas. Janie
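
    Since robots.txt is purely advisory, one server-side option is to refuse those agents outright. A minimal sketch, assuming Apache with mod_rewrite (a spoofed User-Agent gets around it, so it is only a speed bump, not real flood control):

        RewriteEngine On
        # Return 403 Forbidden to anything identifying itself as wget or libwww.
        RewriteCond %{HTTP_USER_AGENT} (wget|libwww) [NC]
        RewriteRule .* - [F,L]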

    Read the article

  • Object-based content management system

    - by Adam Maras
    I remember hearing within the last year or two about a content management system, either being released or still in development, that was centered on product/item information. I'm aware that there are several CMSes that have this capability, but this particular one was built specifically for that task. Also, I remember it winning some sort of award or recognition for upcoming software products. However, I can't for the life of me remember what this CMS was called or who was developing it. Does anyone know what package I'm talking about?

    Read the article

  • Allowed keywords for adwords [closed]

    - by Tom Gullen
    Possible Duplicate: Ok to target product names in adwords? I've swapped the real-world names for made-up ones without losing the semantics, which is a little difficult. A competitor is called "Box Maker". Can I target the keywords "box maker" if my company is selling a tool to help you make boxes? Or is that disallowed? Would "Box Maker" the company be able to file a complaint with Google? Would it go anywhere? The term "box maker" gets a lot of searches and is an incredibly cost-effective search to target.

    Read the article

  • How can I redirect everything but the index as 410?

    - by Mikko Saari
    Our site shut down and we need to serve a 410 to our users. We have a small one-page replacement site set up on the same domain and a custom 410 error page. We'd like to have it so that all page views are answered with 410 and redirected to the error page, except for the front page, which should point to the new index.html. Here's what's in the .htaccess:

        RewriteEngine on
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule !^index\.html$ index.html [L,R=410]

    This works, except for one thing: if I type the domain name, I get the 410 page. With www.example.com/index.html I see the index page as I should, but just www.example.com gets 410. How could I fix this?
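
    A minimal sketch of one possible fix, assuming DirectoryIndex already resolves / to index.html: let the empty per-directory path fall through before the catch-all rule fires.

        RewriteEngine on
        # A request for the bare domain has an empty path in .htaccess context; serve the index.
        RewriteRule ^$ index.html [L]
        # Everything else, except index.html itself, answers 410 Gone.
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule !^index\.html$ index.html [L,R=410]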

    Read the article

  • How can I optimize Apache to use 1GB of RAM on my website? [closed]

    - by Markon
    My VPS plan gives me 1 GB of RAM, burstable to 2 GB. Of course I cannot use 2 GB, or even 1 GB, all day, so I'm planning to optimize the performance of my webserver. The average is about 8,000-10,000 hits per hour, which means about 2 connections per second. The maximum reached so far is about 60,000 hits per hour, which means about 16 connections per second. Unluckily my current Apache configuration uses too much memory (when there are no connected clients, usually during the night, it uses about 1 GB), so I've tried to customize the Apache installation to fit my needs. I'm using Ubuntu, kernel 2.6.18, with apache2-mpm-worker, since I've read it requires less memory, and fcgid (+ PHP). This is my /etc/apache2/apache2.conf:

        Timeout 45
        KeepAlive on
        MaxKeepAliveRequests 100
        KeepAliveTimeout 10
        <IfModule mpm_worker_module>
            StartServer 2
            MinSpareThreads 25
            MaxSpareThreads 75
            MaxClients 100
            MaxRequestsPerChild 0
        </IfModule>

    This is the output of ps aux:

        www-data  9547 0.0 0.3 423828  7268 ? Sl 20:09 0:00 /usr/sbin/apache2 -k start
        root     17714 0.0 0.1  76496  3712 ? Ss Feb05 0:00 /usr/sbin/apache2 -k start
        www-data 17716 0.0 0.0  75560  2048 ? S  Feb05 0:00 /usr/sbin/apache2 -k start
        www-data 17746 0.0 0.1  76228  2384 ? S  Feb05 0:00 /usr/sbin/apache2 -k start
        www-data 20126 0.0 0.3 424852  7588 ? Sl 19:24 0:02 /usr/sbin/apache2 -k start
        www-data 24260 0.0 0.3 424852  7580 ? Sl 19:42 0:01 /usr/sbin/apache2 -k start

    while this is ps aux for php5:

        www-data  7461 2.9 2.2 142172 47048 ? S 19:39 1:39 /usr/lib/cgi-bin/php5
        www-data 23845 1.3 1.7 135744 35948 ? S 20:17 0:15 /usr/lib/cgi-bin/php5
        www-data 23900 2.0 1.7 136692 36760 ? S 20:17 0:22 /usr/lib/cgi-bin/php5
        www-data 27907 2.0 2.0 142272 43432 ? S 20:00 0:43 /usr/lib/cgi-bin/php5
        www-data 27909 2.5 1.9 138092 40036 ? S 20:00 0:53 /usr/lib/cgi-bin/php5
        www-data 27993 2.4 2.2 142336 47192 ? S 20:01 0:50 /usr/lib/cgi-bin/php5
        www-data 27999 1.8 1.4 135932 31100 ? S 20:01 0:38 /usr/lib/cgi-bin/php5
        www-data 28230 2.6 1.9 143436 39956 ? S 20:01 0:54 /usr/lib/cgi-bin/php5
        www-data 30708 3.1 2.2 142508 46528 ? S 19:44 1:38 /usr/lib/cgi-bin/php5

    As you can see it uses a lot of memory. How can I reduce it to fit in just 1 GB of RAM? PS: I'm also thinking about switching to nginx, if Apache can't fit my needs...
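
    A minimal sketch of a tighter budget (the numbers are illustrative assumptions, not drop-in values): each php5 process above holds roughly 30-47 MB resident, so eight or nine of them plus Apache already approach half a gigabyte, which suggests capping both the worker threads and the fcgid process count.

        <IfModule mpm_worker_module>
            StartServers 2
            MinSpareThreads 25
            MaxSpareThreads 50
            ThreadsPerChild 25
            # Worker threads are cheap; the PHP processes are the real memory limit.
            MaxClients 50
            # Recycle children occasionally to contain slow leaks.
            MaxRequestsPerChild 1000
        </IfModule>
        # mod_fcgid caps (directive names vary between mod_fcgid versions):
        FcgidMaxProcesses 8
        FcgidIdleTimeout 60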

    Read the article

  • Rewrite rule to show as directory using .htaccess

    - by chanchal1987
    I want to implement a rewrite rule in my .htaccess file to show a specific URL as a directory on my server. See the code below that I wrote:

        RewriteRule ^(.*)/$ ?page=$1 [NC]

    This rewrites URLs like www.mysite.com/abc/ to www.mysite.com/index.php?page=abc. But if I request www.mysite.com/abc then it throws a 404 error. How can I write a rewrite rule that will match both www.mysite.com/abc and www.mysite.com/abc/? Edit: My current .htaccess file (after the 3rd revision of Litso's answer) is like below:

        ##
        ErrorDocument 401 /index.php?error=401
        ErrorDocument 400 /index.php?error=400
        ErrorDocument 403 /index.php?error=403
        ErrorDocument 500 /index.php?error=500
        ErrorDocument 404 /index.php?error=404
        DirectoryIndex index.htm index.html index.php
        RewriteEngine on
        RewriteBase /
        Options +FollowSymlinks
        RewriteRule ^(.+)\.html?$ $1.php
        RewriteCond !-d
        RewriteRule ^(.*)/$ ?page=$1 [NC,L]
        RewriteCond %{REQUEST_URI} !index.php
        RewriteRule ^(.*)$ ?page=$1 [NC,L]
        ##
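
    A minimal sketch of a rule that accepts the path with or without the trailing slash, assuming index.php is the front controller and real files and directories should still be served directly:

        RewriteEngine on
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        # /?$ makes the trailing slash optional, so /abc and /abc/ both map to ?page=abc
        RewriteRule ^([^/]+)/?$ index.php?page=$1 [NC,L]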

    Read the article

  • Moving from a static site to a CMS with new URLs and meta-data for pages

    - by Chris J
    Hi, I am in the process of rebuilding a site from static pages to a CMS which will be using mod_rewrite to generate new page URLs. In this process our marketing people and I have decided to tidy up the descriptions, keywords and titles. E.g. a page whose URL is currently "website-name/about_us.html" and has a title of "website-name - something not quite page specific" will change to "website-name/about-us/" and title "about us - website-name", and may have a few keywords and the description changed. Our goal with updating the metadata is to improve our page rankings and try to keep in line with some best practices for SEO. Though our current page rankings are quite good in many respects, there is room for improvement. All of the pages will also have content changes (like rearranging heading tags, a new menu on all pages, new content in the footer, extra pieces of dynamic content relating to other pages). In this new site process I plan to use 301 redirects for all the old URLs pointing to the new URLs. My question is: what can I expect to happen to the page rankings in Google, in the short term and the long term? Will this be like kicking off a new site which will have to build up trust over time, or will the original page rankings have an effect?
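
    For the redirect step, a minimal sketch (the domain and paths are placeholders): one permanent redirect per retired URL, left in place indefinitely so inbound links and existing rankings carry over to the new addresses.

        Redirect 301 /about_us.html http://www.example.com/about-us/
        Redirect 301 /contact_us.html http://www.example.com/contact-us/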

    Read the article

  • Development Pipeline / Phases

    - by Chris
    Hey all, I'm looking for a bit of advice. I have been developing websites for quite some time now, and I have come to the stage where I want to run things properly. I am trying to put together a proper workflow for my projects. I have come up with the following and would love any feedback or additions I haven't thought of:

        1. Discovery and Research
        2. Information Architecture
        3. Interaction Design
        4. Visual Design
        5. Site Development
        6. Quality Assurance
        7. Launch, Wine and Cheese

    Cheers,

    Read the article

  • Is it safe to Block These URLs with Robots.txt?

    - by Edgar Quintero
    I have a website that has all URLs optimized and 301-redirected from nasty URLs to clean ones. However, everywhere throughout the site the unclean URLs are linked in menus, content, products, etc. Google currently has all the clean URLs indexed, along with a few unclean URLs too. So the old URLs are still linked everywhere on the site (ideally this wouldn't be the case, but this is how it is ATM). I would like to block the unclean URLs with robots.txt. The question: if I block these unclean URLs with robots.txt while the entire website still links to them (but they all redirect to the clean version), will this affect the indexing status at all?

    Read the article

  • Removing spam external links after pharma hack?

    - by Beatchef
    Back in February my work's site was attacked by a pharma hack at the shared-hosting end. I managed to find the planted file and the reference that ran it in one of our files. I deleted this file, deleted and re-downloaded all of the plugins and themes, and reinstalled WordPress. However, I could never find the database entries, no matter what I read up on: searching for known entries, for drug names written backwards, etc. On the Google and Bing end I have managed to deny and delete the entries and cache of most if not all of the bad links that the hack managed to instantly SEO to death (why don't these guys work legit and make more money?). However, the one thing remaining is external links on the homepage that are invisible except when the site is viewed in Google's cache or scanned with unmaskparasites.com (which says that the external links are safe even though they're obviously not!). http://www.UnmaskParasites.com/security-report/?page=kmcharityteam.co.uk All sorts of website scans say there's nothing wrong with it, and I can't find the source of the links in the header or footer or anywhere in the theme. I've searched for the links in the database, but no luck there either, and they change every day, so really I'd have to be looking for a generator? Does anybody have any advice or a solution for removing these links? Thanks!
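
    A minimal sketch of the kind of database sweep that sometimes turns these up (the spam domain is a placeholder; injected pharma markup is commonly hidden base64-encoded in wp_options or appended to post content):

        -- Look for the spam domain or suspicious encodings in the usual hiding spots.
        SELECT option_name FROM wp_options
         WHERE option_value LIKE '%badpharma.example%' OR option_value LIKE '%base64_decode%';
        SELECT ID, post_title FROM wp_posts
         WHERE post_content LIKE '%badpharma.example%';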

    Read the article

  • Is there any advantage/disadvantage to using robots.txt to disallow access to legal pages such as terms, privacy policy, etc.?

    - by CaptainCodeman
    As I understand, having repetitive content is a detriment to search engine placement. Given that many websites that use similar or even identical "Terms and Conditions" and "Privacy Policy" pages due to similar legal wording or due to copy & pasting from the same source, would it be a good idea to disallow access to these pages via robots.txt, in order to avoid being penalized for "non-original content"? Or, on the contrary, could the search engines identify this as circumvention and penalize the site for trying to hide content? Or does it not matter?
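
    For reference, the exclusion itself is trivial to set up (the paths below are placeholders); the open question above is whether it helps, hurts, or is simply ignored for ranking purposes:

        User-agent: *
        Disallow: /terms/
        Disallow: /privacy-policy/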

    Read the article

  • Redirect public traffic to a different subfolder, while local traffic remains unchanged

    - by ecnepsnai
    I would like to have local (intranet) HTTP traffic go to the /var/www/html folder while any public traffic goes to the subfolder /var/www/html/public. I've tried this configuration, with some variation, in httpd.conf:

        <VirtualHost PRIVATE-IP>
            DocumentRoot /var/www/html
            ServerName ecn
            ErrorLog /var/www/logs/error/private
            CustomLog /var/www/logs/access/private common
        </VirtualHost>

        <VirtualHost PUBLIC-IP>
            DocumentRoot /var/www/html/public
            ServerName PUBLIC-DOMAIN-NAME
            ErrorLog /var/www/logs/error/public
            CustomLog /var/www/logs/access/public common
        </VirtualHost>

    PUBLIC-IP, PRIVATE-IP, and PUBLIC-DOMAIN-NAME are all replaced with the correct values in the actual document. The problem is, local traffic can browse fine, but remote traffic is directed to the root folder and gets a 403 (because I have that folder blocked off through my .htaccess file). If I append /public to the URL it works fine.
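
    A minimal diagnostic sketch (Apache 2.2-era syntax; values are placeholders): this symptom usually means public requests are falling through to the first vhost defined, so the first thing to confirm is how Apache parsed the vhosts and that the public address and port are declared exactly as requests arrive on them.

        # Show how Apache parsed the vhosts and which one is the default:
        #   apachectl -S
        # Each vhost should name an explicit IP:port that the server actually listens on:
        Listen 80
        <VirtualHost PUBLIC-IP:80>
            DocumentRoot /var/www/html/public
            ServerName PUBLIC-DOMAIN-NAME
        </VirtualHost>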

    Read the article

  • Google is not treating two Australian schools as separate sites when both are subdomains of qld.edu.au

    - by LuckySpoon
    My question relates to two websites, each of which is a "Calvary Christian College", but in two totally different locations and entirely unrelated to each other (except by name, and thus domain). All schools in the state are issued a <school-name>.qld.edu.au subdomain, in this case calvary.qld.edu.au and calvarycc.qld.edu.au. Now what's interesting is that these domains are crossing each other in sitelinks for searches such as calvary christian college townsville. The green data here is for one school (the Townsville school, as per the search term), and the red data is for the other school. I put a demotion in for this 6 months ago (we control calvary.qld.edu.au); however, we're seeing no change on the results page. I have been able to get the owners of calvarycc.qld.edu.au to submit demotions for our domain, which should go in sometime in the next few days. What can we do to tell Google that these websites are not interchangeable, despite both appearing as "subdomains" of qld.edu.au? We can possibly open channels of communication with the administrators of qld.edu.au but will need to tell them what we need to change, and at this point I'm out of ideas.

    Read the article

  • Identify "non-secure" content IE warns about [on hold]

    - by Doug Harris
    As many know, if you serve a page over https and the content loads resources (images, stylesheets, js, SWF objects, etc) over http, older versions of Internet Explorer will show the user a warning saying "This page contains both secure and non-secure items". This is discomforting to many non-technical users. Usually, I can look at the HTML source and identify which item(s) are triggering this error. Sometimes a Flash object will load something else or some embedded javascript will put a new object in the DOM and trigger this. What tools are good for quickly tracking down the source of the warning?

    Read the article

  • Woolrich Prezzi that are perfect for every woman

    - by WoolrichParka
    The Woolrich Prezzi parka layers are made with 100% down and with a sensible blend on the outer lining, which suits the most fashion-forward customers. The Woolrich Arctic layers are made in the traditional shade, which also pairs with the Woolrich Parka hood and gives women an elegant look at a reasonable price. Woolrich Outlet Bologna now shows a wide range of Woolrich Parka Men designs for your choice, all of which are worth buying in design and price. wufengfengmaple36

    Read the article

  • Alternatives to Marin Software for ppc management? [closed]

    - by Skyao
    Does anyone have a suggestion for a PPC management tool similar to Marin Software but much cheaper? Marin Software Enterprise charges a minimum of several thousand dollars per month. The functionality needed is as follows:

        - Keyword creation and management
        - Campaign management
        - Automated bidding and ROI tools
        - Reporting and analytics
        - Ability to upload/download customized revenue data

    Any suggestions would be appreciated. Thanks.

    Read the article

  • Redirect/Rewrite Subdomain to Subfolder

    - by Laurent Ho
    I'm trying to redirect a subdomain to a subfolder, e.g. forums.domain.com to www.domain.com/forums. Note that I started the forums in the subfolder format but worried that members might mistakenly try to access the forums using the subdomain format.

        RewriteCond %{HTTP_HOST} ^(www\.)?forums\.domain\.com
        RewriteRule .* /forums [L]

    From what I've read, the code above should work through .htaccess, but do I still need to create a DNS A record to point the subdomain to the IP address of the server?
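
    On the DNS point: unless a wildcard record already covers it, forums.domain.com does need its own A or CNAME record pointing at the server before Apache ever sees the request. Also note that the rules above perform an internal rewrite, so visitors would keep the forums.domain.com address in their browser; a minimal sketch of the external-redirect variant:

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^forums\.domain\.com$ [NC]
        # Send the visitor to the canonical subfolder URL with a permanent redirect.
        RewriteRule ^(.*)$ http://www.domain.com/forums/$1 [R=301,L]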

    Read the article

  • HTTP header 302 error

    - by Katherine Katie
    Response headers:

        status         HTTP/1.1 302 Found
        connection     close
        pragma         no-cache
        cache-control  no-cache
        location       /
        location       /NKiXN/

    I don't know how it got this way. I used the W3 Total Cache plugin, but I have deactivated it for now. Please help me solve it; the site is coming down in the search engine rankings and Googlebot is unable to follow it. Urgent help required. If this is a configuration problem with the server, please let me know the solution. Site: http://onlinecheapestcarinsurance.co.uk/
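
    A minimal way to watch the redirect chain from the command line (the duplicate location headers above suggest something, a plugin or an .htaccess rule, is issuing the redirect twice):

        # -s silent, -I headers only, -L follow redirects; print each hop's status and Location
        curl -sIL http://onlinecheapestcarinsurance.co.uk/ | grep -iE '^(HTTP|Location)'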

    Read the article
