Search Results

Search found 9728 results on 390 pages for 'zee pro'.

  • Wordpress with user login and file manager support

    - by Don
    This may be an RTFM kind of thing, so I'll apologize up front. I've been asked by a friend I used to freelance for whether there's a solution in WordPress where users can log in and then upload their own files in a "my docs" kind of area. I've never used WP, so before I dig into the docs I thought I'd see if anyone here can confirm or point me to a resource. It's one of those "I'll look it up at lunch and get back to you" things, which is why I'm bugging you all before reading the docs. Thanks

  • Keeping rackspace vserver alive

    - by mit
    It appears to me that Rackspace somehow freezes cloud VMs after some idle time. This means the first request to a PHP page takes much longer to respond than subsequent requests. In some cases that's fine; in others it isn't acceptable. I am currently querying the machine with wget from a different host to keep it "alive", but I wonder what frequency is actually necessary. Does anyone know the idle period after which they send a VM to "sleep"? I'd guess it's a matter of minutes. EDIT: There is absolutely no caching involved on the PHP site. It just recently moved from another vhost, and there was never such latency on the first request there.
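    A minimal sketch of the wget-style keep-alive the poster describes, offered as an assumption rather than a known fix: fetch one page on the VM at a fixed interval so it never sits idle. The URL and the 5-minute interval are placeholders, not documented Rackspace thresholds; shorten the interval until the first-hit latency disappears.

        import time
        import urllib.request

        URL = "http://example.com/index.php"   # hypothetical page on the cloud VM
        INTERVAL_SECONDS = 5 * 60              # a guess; tune until first-hit latency stays low

        while True:
            try:
                # Any successful fetch should count as activity from the host's point of view.
                with urllib.request.urlopen(URL, timeout=30) as resp:
                    print(f"{time.strftime('%H:%M:%S')} status={resp.status}")
            except OSError as exc:
                print(f"{time.strftime('%H:%M:%S')} request failed: {exc}")
            time.sleep(INTERVAL_SECONDS)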

  • Effect on speed of WordPress membership plugins -- currently trying s2member [migrated]

    - by Richard
    I'm taking a look at s2member -- I have it running, and my site is very slow: it's taking about 9 or 10 seconds to load on average. This is the site: http://richardclunan.net I want to figure out whether the s2member plugin is causing the slowness, and whether there are other, faster membership plugins. Three questions:
    1. Are there particular settings or things specific to s2member that I should take care of to ensure it doesn't slow my site down?
    2. If I deactivate the plugin to test the speed of the site without it, will I have to respecify the s2member settings when I reactivate it?
    3. After it's reactivated, will members' accounts still work OK?
    Anybody have observations on s2member or other WordPress membership plugins and their effect on site speed?
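    A minimal sketch (an assumption, not an s2member-specific recipe) of how to check whether the plugin accounts for the 9-10 seconds: time the front page with the plugin active, deactivate it, and time it again. The sketch only measures the HTML response, not full page rendering, and the sample count is arbitrary.

        import time
        import urllib.request

        URL = "http://richardclunan.net/"  # the site mentioned in the question

        def average_load_time(url, samples=5):
            timings = []
            for _ in range(samples):
                start = time.perf_counter()
                with urllib.request.urlopen(url, timeout=60) as resp:
                    resp.read()  # read the whole body so the timing includes transfer
                timings.append(time.perf_counter() - start)
            return sum(timings) / len(timings)

        if __name__ == "__main__":
            print(f"average load time: {average_load_time(URL):.2f}s")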

  • Images not indexed by Google since moving to a CDN

    - by dfunkydog
    Last week I moved all the images on coffeeandvanilla.com to a CDN (maxcdn.coffeeandvanilla.com). The problem I'm having is that although the sitemap (generated by the Yoast WordPress SEO plugin) points images to the correct location, Google indexes images from the category and page sitemaps but 0 images from the posts sitemap (see screenshot https://dl.dropbox.com/u/4635252/sitemap.png). The website had been doing quite well in Google image search before the change; visits from image search have dropped from ~200/day to 11 yesterday. Here is an example entry from the generated posts.xml sitemap: http://pastebin.com/vcMRf9VW Can anyone suggest where the problem lies? Why have I lost all my Google image juice? Should I just wait some more, and how long before I really start worrying?
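    A minimal diagnostic sketch, based on the assumption (not stated in the question) that some CDN image URLs in the posts sitemap might not be responding correctly: pull every image-looking URL out of the sitemap and HEAD-check it, since 4xx/5xx answers from maxcdn.coffeeandvanilla.com would explain images dropping out of the index. The sitemap path is hypothetical.

        import re
        import urllib.error
        import urllib.request

        SITEMAP_URL = "http://coffeeandvanilla.com/post-sitemap.xml"  # hypothetical path

        with urllib.request.urlopen(SITEMAP_URL, timeout=30) as resp:
            xml = resp.read().decode("utf-8")

        # Crude extraction: any URL in the sitemap that ends in a common image extension.
        image_urls = re.findall(r"https?://[^<\s]+\.(?:jpe?g|png|gif)", xml, re.IGNORECASE)

        for url in image_urls:
            request = urllib.request.Request(url, method="HEAD")
            try:
                with urllib.request.urlopen(request, timeout=30) as resp:
                    status = resp.status
            except urllib.error.HTTPError as exc:
                status = exc.code
            if status != 200:
                print(status, url)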

  • SQL Server Transaction Log Fragmentation: a Primer

    Generally, you will have no need to worry about the number of virtual log files in your transaction log. However, if you use the default settings for 'auto-grow', you can end up with enough 'fragmentation' in your transaction log to affect performance noticeably. How can this be avoided? How can you tell it's a problem? What do you do about it? Greg explains.
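    As a companion to "How can you tell it's a problem?", here is a minimal sketch for counting VLFs from Python; it assumes SQL Server 2016 SP2 or later (where sys.dm_db_log_info is available), pyodbc, and the connection details shown, none of which come from the article. On older versions, DBCC LOGINFO returns one row per VLF instead.

        import pyodbc

        conn = pyodbc.connect(
            "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
            "DATABASE=master;Trusted_Connection=yes;"
        )
        cursor = conn.cursor()
        # sys.dm_db_log_info returns one row per virtual log file; a count in the
        # thousands usually points at auto-grow fragmentation.
        cursor.execute("SELECT COUNT(*) FROM sys.dm_db_log_info(DB_ID('MyDatabase'));")
        print("VLF count:", cursor.fetchone()[0])
        conn.close()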

  • Google Analytics is counting way too much

    - by Luticka
    I have a website using Google Analytics, but it seems to be counting way too much. To test this I logged every visit to my database with the time and IP address. My result for one day was:
    Google Analytics: Visits: 4078, Absolute Unique Visitors: 3758
    My database: Visits: 4182, Unique Visitors (by IP only): 905
    I use the tracking option "One domain with multiple subdomains" because the website is accessible on both www.example.com and example.com. Am I missing something, or what could be wrong?
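    One possible source of the gap, offered as an assumption rather than a diagnosis: deduplicating by IP alone merges every visitor behind a shared NAT or proxy address, while Analytics deduplicates per browser cookie. A rough sketch against a hypothetical CSV export of the logging table (columns ip, user_agent, ts; the user-agent column is not something the poster currently logs) shows how far the two counting rules can diverge:

        import csv

        unique_by_ip = set()
        unique_by_ip_and_agent = set()

        with open("visits.csv", newline="") as f:  # hypothetical export of the logging table
            for row in csv.DictReader(f):
                unique_by_ip.add(row["ip"])
                unique_by_ip_and_agent.add((row["ip"], row["user_agent"]))

        print("unique by IP only:        ", len(unique_by_ip))
        print("unique by IP + user agent:", len(unique_by_ip_and_agent))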

  • GeoTrust SSL brand name used by re-sellers

    - by Christopher
    I feel like I got the bait-and-switch from my web host provider, since they advertise "GeoTrust SSL" for $99. I purchased it thinking the certificate would be issued by geotrust.com, but then I got an email from Comodo saying they are providing it. My host provider says they get a discount by using Comodo. I purchased the certificate with the understanding it would be issued by GeoTrust. I called the host provider and they said they usually expect it from GeoTrust, but someone from email support responded saying the product name is "GeoTrust SSL" while they use Comodo to get a discount. I think this is bogus and an unfair trade practice. However, searching for "GeoTrust" on Google brings up a ton of websites selling "GeoTrust" certificates. How can companies get away with this? Since the host provider is part of the BBB, I plan to ask them to update the purchase page on their website to state clearly that "This certificate is provided at a discount and may be issued by a provider other than GeoTrust.com, such as Comodo.com". Any feedback on this is appreciated.

  • Redirect before rewrite

    - by Kirk Strobeck
    Had an issue where I need to redirect old URLs, but not disable the mod_rewrite for page structure.
    redirect 301 /home.html http://www.url.com/
    It needs to live on the Symphony 2.0 .htaccess file:
        ### Symphony 2.0.x ###
        Options +FollowSymlinks -Indexes

        <IfModule mod_rewrite.c>
            RewriteEngine on
            RewriteBase /

            ### DO NOT APPLY RULES WHEN REQUESTING "favicon.ico"
            RewriteCond %{REQUEST_FILENAME} favicon.ico [NC]
            RewriteRule .* - [S=14]

            ### IMAGE RULES
            RewriteRule ^image\/(.+\.(jpg|gif|jpeg|png|bmp))$ extensions/jit_image_manipulation/lib/image.php?param=$1 [L,NC]

            ### CHECK FOR TRAILING SLASH - Will ignore files
            RewriteCond %{REQUEST_FILENAME} !-f
            RewriteCond %{REQUEST_URI} !/$
            RewriteCond %{REQUEST_URI} !(.*)/$
            RewriteRule ^(.*)$ $1/ [L,R=301]

            ### ADMIN REWRITE
            RewriteRule ^symphony\/?$ index.php?mode=administration&%{QUERY_STRING} [NC,L]
            RewriteCond %{REQUEST_FILENAME} !-d
            RewriteCond %{REQUEST_FILENAME} !-f
            RewriteRule ^symphony(\/(.*\/?))?$ index.php?symphony-page=$1&mode=administration&%{QUERY_STRING} [NC,L]

            ### FRONTEND REWRITE - Will ignore files and folders
            RewriteCond %{REQUEST_FILENAME} !-d
            RewriteCond %{REQUEST_FILENAME} !-f
            RewriteRule ^(.*\/?)$ index.php?symphony-page=$1&%{QUERY_STRING} [L]
        </IfModule>
        ######

  • What marketplace / garage-sale software package does togoparts.com use?

    - by gus
    See: OpenSource Marketplace Platform. I also want to start a site for end users to buy/sell used sporting goods of a particular type. When the scope of goods is narrowed like this, it is very advantageous to be able to filter by brand, size, price range, etc. Nice features:
    - account reputation with user comments
    - listings sortable by many custom fields
    - auto resize and recompress of image uploads
    I don't want to reinvent the wheel, so does anyone know where I can start?

  • Chrome says my website is Serbian...how do I "force" Chrome to use English?

    - by kristen
    Our website is based in the U.S., all of our users are in the U.S., and the site is written in English. When we open the page in Chrome, an alert comes up: "This page is in Serbian...would you like to translate it?" I imagine some JavaScript or other code is triggering that. Is there a way to force Chrome to treat the language as English? We tried googling this question but came up with nothing. thx!

  • How can I stop a bot attack on my site?

    - by tnorthcutt
    I have a site (built with wordpress) that is currently under a bot attack (as best I can tell). A file is being requested over and over, and the referrer is (almost every time) turkyoutube.org/player/player.swf. The file being requested is deep within my theme files, and is always followed by "?v=" and a long string (i.e. r.php?v=Wby02FlVyms&title=izlesen.tk_Wby02FlVyms&toke). I've tried setting an .htaccess rule for that referrer, which seems to work, except that now my 404 page is being loaded over and over, which is still using lots of bandwidth. Is there a way to create an .htaccess rule that requires no bandwidth usage on my part? I also tried creating a robots.txt file, but the attack seems to be ignoring that.
        # This is the relevant part of the .htaccess file:
        RewriteCond %{HTTP_REFERER} turkyoutube\.org [NC]
        RewriteRule .* - [F]
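    A small companion sketch (an assumption, not part of the question): tally requests per referrer and per IP in the Apache combined-format access log, to confirm how much of the traffic really comes from turkyoutube.org and to collect the attacking IPs, which could then be blocked at the firewall so even the 404 responses stop costing bandwidth. The log path is hypothetical.

        import re
        from collections import Counter

        LOG_PATH = "/var/log/apache2/access.log"  # hypothetical location

        # Apache combined log format: ip - - [date] "request" status bytes "referrer" "user-agent"
        combined = re.compile(
            r'^(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "[^"]*" \d+ \S+ "(?P<referrer>[^"]*)"'
        )

        referrers, attacker_ips = Counter(), Counter()
        with open(LOG_PATH) as log:
            for line in log:
                match = combined.match(line)
                if not match:
                    continue
                referrers[match.group("referrer")] += 1
                if "turkyoutube.org" in match.group("referrer"):
                    attacker_ips[match.group("ip")] += 1

        print(referrers.most_common(10))
        print(attacker_ips.most_common(20))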

  • Multiple domains

    - by menardmam
    I have 6 domains: company.ca and company.com (both were free, and we are a Canadian company but can do business with the rest of the world). We sell sportswear, and because the company name is totally unknown to the world, we have also bought product-specific domains: chandails.ca and t-shirt.ca, as well as shorts.ca and shorts.com. So those 6 domains are mine. Now, what is the best way to handle them? Should they all be 301-redirected to the main company name (.com), or should I make micro-sites (super simple one-page sites, one optimized just for shirts and one for shorts) and tell people to go to the main site to learn more? Right now I can't really see the benefit of having the search keyword in the domain name if nobody ever sees anything on that domain. I'm confused and can't find a straight answer to this question.

  • Google AdSense - providing access (via an additional account?) to a third party

    - by Homunculus Reticulli
    I am working with a partner who will be handling the marketing side of things for one of my websites. He has informed me that he will require access to my AdSense account. I need to create an additional account for him so that he can access and manage Google AdWords / ad units etc. using his own login credentials. However, despite searching Google for a while now, I can't seem to locate any information about creating additional user accounts. Does anyone know how I can do this?

  • FFmpeg & Installation on phpmyadmin

    - by Vivek
    I am attempting to build an interface in which people can upload music files and listen to them through the site. The biggest problem is that someone who uploads an audio track in mp3 format wouldn't be able to play it back in Mozilla Firefox (Firefox doesn't support mp3 playback with jPlayer, which is what I'm using). I did some research and found that I could use command-line PHP with FFmpeg to convert the mp3 to ogg or some other supported format. I believe I understand (a little bit) how command-line PHP works, but I was wondering how I could install it onto phpMyAdmin on my hosting service? Could anyone link me to a tutorial or care to explain? I tried googling it but I just couldn't find it.
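    The conversion itself is a single FFmpeg command; the sketch below is an assumption about how it could be scripted (shown in Python purely to illustrate the command line; the same invocation can be run from PHP with shell_exec or exec). It requires the ffmpeg binary with libvorbis on the host, and the file paths are hypothetical.

        import subprocess

        def mp3_to_ogg(src, dst):
            # -codec:a libvorbis transcodes the audio to Ogg Vorbis; -qscale:a 5 sets VBR quality.
            subprocess.run(
                ["ffmpeg", "-y", "-i", src, "-codec:a", "libvorbis", "-qscale:a", "5", dst],
                check=True,
            )

        mp3_to_ogg("uploads/track.mp3", "uploads/track.ogg")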

  • How can I get cross-browser consistent behavior for TR heights within a table with a set height? [migrated]

    - by Dan
    I have an arbitrary number of tables with an arbitrary number of rows in each, and all tables are the same height. My initial approach was to just set the overall height of the table and hope the rows were smart enough to distribute themselves appropriately. That's not the case. I have 4 different behaviors going on with 4 browsers, but I need them all to render in at least a similar way.
    - Safari & Chrome (WebKit): All rows are equal height, creating scroll bars as needed and fitting within the table height.
    - Firefox: All rows are the height necessary to fit their content, with the remaining rows overflowing out of the table. Additionally, if the content of the rows does not take up all of the height, only the part of the table with content in it takes the background (though it seems, through use of Firebug, that the actual table [and TR] extend to the bottom of the proper table height).
    - IE: All rows are the height necessary to fit their content, with the remaining rows overflowing out of the table.
    Obviously this only includes one version of each browser, and additional variation would likely appear with more being tested. Ideally, a solution where the browser renders TRs with less content smaller than those with larger content, while still using scrolling within the variable-height TRs when the overall height of the table is not enough, would be optimal. I could potentially see a solution to achieve that with JS, but can it be done with CSS? Or, if not, can the behavior that WebKit displays be made to work across the browsers? Thanks! PS: Example can be found here.

  • How to filter traffic coming to a particular page from another page?

    - by BishKopt
    I've got page A linking to page B. There are also other pages linking to B. How can I see the traffic that is coming to page B ONLY from page A? I can somehow do it via the Behavior flow report:
    Behavior > Behavior flow > [right click on anything] > Explore traffic through here > [click edit icon] > Define a page group > [right click] > Group details > [dropdown] > Incoming traffic
    But how do I do it in the normal reports? Is there any way to filter out only the traffic coming from a particular page?

  • Adsense ads are not a good fit for my site

    - by Ryan Grush
    I run an academic network for college students to communicate at particular universities, and we run Google AdSense. The site pulls in a decent amount for a side project, but our CTR is horrible (<0.2%) and our RPM is equally low. The problem is that Google pegs us as an education site (which we are) but shows our users ads for U of Phoenix, DeVry and other for-profit universities. All of our users are students at higher-caliber institutions and therefore have no use for these ads. I've known about this problem for some time, but I don't know what to do to show more relevant ads instead (e.g. spring break, school apparel, poker, sports, etc.). What would be the best way to change this?

  • Why does googling for keycaptcha give results for reCAPTCHA? [closed]

    - by vgv8
    EDIT: I'd like to change this title to: How to STOP Google's manipulation of the Google search results presented to the general public?
    I google frequently, and more and more often, when searching for a particular software product, I am instead given results for Google's own products. For example, if I google the keyword keycaptcha for the "Past 24 hours" (after clicking on "Show search tools" and then "Past 24 hours" in the left sidebar of the browser), Google's search results show only results about reCAPTCHA. Image uploaded later:
    Though, if I confine keycaptcha in quotes, the results are "correct" (well, kind of, since they are still distorted in comparison with other search engines). I checked this over a few months from different domains at different ISPs, on different operating systems and from a dozen browsers. The results are the same. Why is this, and how can it possibly be corrected? My related posts: "How Gmail spam filter works?", IP addresses blacklisting.
    Update: It is impossible for me to use google.com directly, as I am always redirected from google.com to google.ru by Google's ip-address "auto-detect location" "convenience". Google's help says it is impossible to switch off location auto-detection because it is a very helpful feature. There is a work-around, google.com/ncr, to prevent redirection from google.com (does anybody know what that actually means?), but all results are exactly the same. OK, I can search with quoted "keycaptcha" (I am already accustomed to these Google quirks), but the question arises: why burn time promoting someone's product if Google shows its own interests/brands (reCAPTCHA) in place of other product brands, and what can be done about it? The general user will not understand that he was cheated and will just pick up the first (wrong) results.
    Update 2: Note that this behaviour: is independent of whether I am logged in to (or out of) a Google account, which account, the browser (I tried Opera, Chrome, Firefox, IE of different versions, Safari), the OS, or even the domain; there are many such cases, but I targeted one concrete, restricted example specifically to prevent wandering between unrelated details and peculiarities. @Michael, first, that is not true, and this text contains 2 links to real and significant results. I also wrote that this is just one concrete example out of many, based on many months of experience. These distortions happen upon clicking on "Past 24 hours", "Past week", "Past month", "Past year", with many other keywords and in other search configurations. Second, the absence of results is itself a result, and there is no point in sneakily substituting it with another, unsolicited one. That is the definition of spam and scam. Third, the question is not about workarounds like how to write search queries or how to use other search engines. The question is how to straighten out Google's results so they stop disorienting the general public.
    Update: I can't understand: does nobody reproduce the behaviour I described (i.e., when I click the "Past 24 hours" link in a Google search for keycaptcha, the results presented are only about reCAPTCHA)?
    Update: And for the "Past week":

  • Chrome trims the last <li> element in a row [closed]

    - by Paul
    Ok guys, I give up. Here's the code I'm struggling with:
        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
        <html xmlns="http://www.w3.org/1999/xhtml">
        <head>
          <meta http-equiv="Content-Type" content="text/html;charset=utf-8" />
          <title>Blah</title>
          <style type="text/css">
            #container {
              margin: 0 auto;
              width: 350px;
              border: 1px solid #ccc;
            }
            ul {
              list-style-type: none;
              margin: 0;
              padding: 0;
              text-align: center;
            }
            ul li {
              display: inline;
              padding: 5px;
              margin: 0 1px;
              background-color: lime;
              line-height: 2em;
              /* border: 1px solid red; */
            }
          </style>
        </head>
        <body>
          <div id="container">
            <ul>
              <li>Element A</li>
              <li>Element B</li>
              <li>Element C</li>
              <li>Element D</li>
              <li>Element E</li>
              <li>Element F</li>
            </ul>
          </div>
        </body>
        </html>
    Why the heck does Chrome trim the right side of "Element D" (even though there is enough space to display the whole item), while Firefox and even Internet Explorer render this code properly? It becomes more visible when we apply the commented border. In other words, is there a way to tell the browser that I want every single <li> element to be autonomous, and thus to move it to the next row if it doesn't fit entirely in the previous one? Can't wait to see the solution, thanks in advance.

  • Tracking first visit date in Google Analytics

    - by dusan
    I want to track a site's visitor retention using Google Analytics, to see whether unregistered users are returning to it over a time frame of 2+ months from now. This blog post seems to be on the right track, but I want to track unregistered users, so I don't have a "join date" or similar variable at my disposal. This other blog post suggests using all 5 GA custom variables, using the first variable slot on the first week, variable 2 on week 2, etc. That method only lets me track 5 weeks of visitors. I want to track more than 5 weeks, so I was thinking of using two custom variables in GA: the visitor's first visit date and the visitor's last visit date. How can I save the first visit date? If I save another value in the same slot, the old value will be overwritten, and I don't know how reliable it is to save that variable conditionally (reading the __utmv cookie to check whether the "first visit date" is set, and if it isn't, saving the current date).

  • URL hex characters in .htaccess

    - by Steve
    There is an old page with a space in the filename, and this is no longer found on the website. So I need to redirect this page to another page using a 301 redirect in .htaccess. If I place the filename directly into .htaccess (Bouquets%20%26%20Loose.html), the redirect does not work. If I escape the % sign like this (Bouquets\%20\%26\%20Loose.html), the redirect still does not work. How do I get this redirect to work in .htaccess? Thanks.

  • Password protect an alias virtual directory

    - by Jason
    I have a main domain being hosted through CPanel. I also have a sub-domain that I would like to appear as a path under the main domain instead of as a sub-domain. So I have:
    http://example.com/ pointing to the main hosted file.
    http://example.com/mydir pointing to the subdomain files.
    This is achieved by a httpd.conf include from the main domain section to set an alias:
        alias /mydir /path/to/subdomain/files/
    Now, that works fine so far. The problem is that if a .htaccess file under /path/to/the/subdomain/files/ contains an error, the alias is completely skipped, and /mydir goes instead to the main host files. That is kind of surprising to me - I would expect an error to return an error instead. Now the killer: if I try to password protect /path/to/subdomain/files/, then trying to access http://example.com/mydir will again attempt to deliver from under the main hosted files and not from /path/to/subdomain/files/ I am not seeing any errors reported on the .htaccess file in the apache error log, so I am assuming the .htaccess is valid:
        AuthUserFile /path/to/valid/readable/.htpasswd
        AuthName "Secure Access"
        AuthType Basic
        Require valid-user
    This kind of behaviour does not seem right to me. Is there something obvious that could be causing it? Or is this just the way it works? Perhaps using an alias is the wrong way to go?

  • How to get a new site indexed by Alexa

    - by JohnMerlino
    When I search for my site on Alexa, it says "Alexa Traffic Rank: No Data". So I googled the issue and came across this page: http://www.loudable.com/my-website-data-not-showing-in-alexa-get-your-website-crawled-by-alexasolution.html It says that to get a site indexed, you click "Crawl my site" on the webmasters page. However, there is no longer a link that says "Crawl my site". So, as of now, does anyone know how to get a site indexed by Alexa so that my traffic rank will display in the Alexa index?

  • Desktop Software to monitor online status of web site and web-based application

    - by pansp
    I'm basically looking for desktop-based software that can monitor the online availability of my company's website and web application. I know there are a few online services like Uptime Robot that do the same job, but I have been asked to find desktop-based software that can run in the system tray while monitoring and notify me of any downtime. Free software would be great. Any help would be appreciated. Thanks!
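    For reference, the core of such a monitor is small; this is a minimal sketch under assumed URLs and interval, without the system-tray notification a packaged desktop tool would add: poll each site and log an alert whenever a check fails.

        import time
        import urllib.request

        SITES = ["http://example.com/", "http://app.example.com/login"]  # hypothetical URLs
        INTERVAL_SECONDS = 60

        def is_up(url):
            try:
                with urllib.request.urlopen(url, timeout=15) as resp:
                    return resp.status < 400
            except OSError:
                return False

        while True:
            for site in SITES:
                if not is_up(site):
                    print(f"{time.strftime('%Y-%m-%d %H:%M:%S')} DOWN: {site}")
            time.sleep(INTERVAL_SECONDS)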

  • Creating date based back entries for a blog and its site registration

    - by open_sourse
    I was showing a blog to a colleague and telling him how the author has been blogging regularly for over ten years now. My colleague tells me that anyone can register a domain name and backdate entries to, say, circa 2000. When I argued that the site registration date can easily show that the registration was done recently, he put forward two arguments:
    1. The author can claim that he moved from an old domain name which was registered many years ago and lapsed, and that he took the data and rebuilt it on the new site.
    2. The author can buy an expired domain which was on the internet for many years.
    I am not sure whether these tricks can really con someone into believing you have been a blogger for over a decade, but I do not have enough expertise in the topic to refute him. So I thought I would ask the wise community here at StackExchange. Can anyone give me some insight?
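    One practical way to test such claims, offered as an assumption rather than something raised in the discussion: check the Internet Archive's Wayback Machine, since a blog that genuinely existed around 2000 will usually have snapshots near that date regardless of what the current WHOIS record says. A minimal sketch using the public availability endpoint:

        import json
        import urllib.request

        def snapshot_near(domain, timestamp="20000101"):
            # Return the timestamp of the archived snapshot closest to the given date, if any.
            url = f"https://archive.org/wayback/available?url={domain}&timestamp={timestamp}"
            with urllib.request.urlopen(url, timeout=30) as resp:
                data = json.load(resp)
            closest = data.get("archived_snapshots", {}).get("closest")
            return closest["timestamp"] if closest else None

        print(snapshot_near("example.com"))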
