Search Results

Search found 10402 results on 417 pages for 'macbook pro'.


  • Facebook - Filter Page posts by #hashtag [on hold]

    - by beppe9000
    I'm trying to gather all the posts (both official posts and visitors' posts) on my Facebook Page that contain a specified #hashtag, so I can later show them on a website. But I have no clue how to accomplish this. Is there any API capable of this, or anything else that could help? Basically I need to collect the IDs of all those posts by looping through the Page's posts and checking each one for my hashtag. I'm planning to do this server-side, so PHP is my choice.
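
    A minimal sketch of that server-side loop, assuming the Graph API page feed endpoint and a valid access token (the endpoint, the field names, and the pagination shape here are assumptions based on the Graph API's general conventions, not details taken from the question):

        <?php
        // Hypothetical page ID and access token -- replace with real values.
        $pageId = 'YOUR_PAGE_ID';
        $token  = 'YOUR_ACCESS_TOKEN';
        $tag    = '#myhashtag';

        $matching = array();
        // Assumed Graph API feed endpoint; results come back paginated.
        $url = "https://graph.facebook.com/{$pageId}/feed?access_token={$token}";

        while ($url) {
            $feed = json_decode(file_get_contents($url), true);
            foreach ($feed['data'] as $post) {
                // Keep the IDs of posts whose message contains the hashtag.
                if (isset($post['message']) && stripos($post['message'], $tag) !== false) {
                    $matching[] = $post['id'];
                }
            }
            // Follow the "next" paging link until the feed is exhausted.
            $url = isset($feed['paging']['next']) ? $feed['paging']['next'] : null;
        }

        print_r($matching);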

    Read the article

  • Writing files in a subfolder of the web folder (Apache security)

    - by Homunculus Reticulli
    I need to save session data for a dynamic web page script by writing to file. I have two questions: Are there any security reasons to prefer saving the data UNDER the web folder versus OUTSIDE the web folder? I attempted to write to the folder and (unsurprisingly) got a 'file permission refused' type error. Should I give the folder to the Apache user, and what permissions should it have (600, 640 or 644)? [[Edit]] The layout is:
    core <- 'OUTSIDE' web folder (PHP scripts live here)
    data <- 'OUTSIDE' web folder (session data and other misc data resides here)
    web  <- web root folder
      js   <- this and any folder below it is 'INSIDE' the web folder
      css
      html
    For example, in a PHP script (i.e. a dynamic PHP page), I can attempt to write to a file with something like fopen('../data/session.dat', 'w') - yet (as I understand it) ../data should not be accessible, for security reasons. Could someone please provide a simple example that shows how to give the script access to ../data/ in the layout above? What are the actual SPECIFIC steps required? BTW, I am running on a LAMP stack.
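
    A minimal sketch of the usual approach (keep the data directory outside the web root and make it writable by the web server user, e.g. chgrp www-data data && chmod 770 data - the group name www-data, the paths, and the file layout are assumptions, not verified details of this setup):

        <?php
        session_start();

        // Absolute path to the 'data' directory OUTSIDE the web root,
        // assuming this script lives directly in the 'web' folder above.
        $dataDir = dirname(__DIR__) . '/data';

        // Fail early if the web server user cannot write here; that is the
        // 'permission refused' symptom, fixed with chgrp/chmod as above.
        if (!is_writable($dataDir)) {
            die('data directory is not writable by the web server user');
        }

        // Write the session payload; LOCK_EX stops concurrent requests
        // from clobbering each other's writes.
        $file = $dataDir . '/session_' . session_id() . '.dat';
        file_put_contents($file, serialize($_SESSION), LOCK_EX);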

    Read the article

  • Weird CSS behaviour [migrated]

    - by WMRKameleon
    <!DOCTYPE html> <html> <head> <title>PakHet</title> <link rel="stylesheet" type="text/css" href="css/basis.css" /> </head> <body> <div class="wrapper"> <div id='cssmenu'> <ul> <li class="active"><a href='index.html'><span>Start</span></a></li> <li><a href='pakhet.html'><span>Over PakHet</span></a></li> <li><a href='overons.html'><span>Over Ons</span></a></li> <li class='has-sub '><a href='#'><span>Uw pakket</span></a> <ul> <li><a href='aanmelden.php'><span>Aanmelden</span></a></li> <li><a href='traceren.php'><span>Traceren</span></a></li> </ul> </li> </ul> </div> <div class="header"> <h1>Hier komt de titel van de website</h1> </div> <div class="content"> <p>Dit is de tekst van de content. Dit is de indexpagina.</p> </div> </div> </body> </html> And this is the CSS: /* CSS RESET */ html, body, div, span, applet, object, iframe, h1, h2, h3, h4, h5, h6, p, blockquote, pre, a, abbr, acronym, address, big, cite, code, del, dfn, em, img, ins, kbd, q, s, samp, small, strike, strong, sub, sup, tt, var, b, u, i, center, dl, dt, dd, ol, ul, li, fieldset, form, label, legend, table, caption, tbody, tfoot, thead, tr, th, td, article, aside, canvas, details, embed, figure, figcaption, footer, header, hgroup, menu, nav, output, ruby, section, summary, time, mark, audio, video, *{ margin: 0; padding: 0; border: 0; vertical-align: baseline; } table { border-collapse: collapse; border-spacing: 0; } /* Einde CSS RESET, nu echte code */ html, body{ background:url(../images/bg_picture.jpg) fixed no-repeat; } .wrapper{ margin:0 auto; } .header{ margin:0 auto; background-color:rgba(0, 0, 0, 0.4); } .content{ background-color:rgba(0, 0, 0, 0.4); width:600px; margin:0 auto; margin-top:50px; } .content p{ color:white; text-shadow:1px 1px 0 rgba(0,0,0, 0.5); font-family:"Lucida Grande", sans-serif; } #cssmenu{ height:37px; display:block; padding:0; margin: 0; border:1px solid; } #cssmenu > ul {list-style:inside none; padding:0; margin:0;} #cssmenu > ul > li {list-style:inside none; padding:0; margin:0; float:left; display:block; position:relative;} #cssmenu > ul > li > a{ outline:none; display:block; position:relative; padding:12px 20px; font:bold 13px/100% "Lucida Grande", Helvetica, sans-serif; text-align:center; text-decoration:none; text-shadow:1px 1px 0 rgba(0,0,0, 0.4); } #cssmenu > ul > li:first-child > a{border-radius:5px 0 0 5px;} #cssmenu > ul > li > a:after{ content:''; position:absolute; border-right:1px solid; top:-1px; bottom:-1px; right:-2px; z-index:99; } #cssmenu ul li.has-sub:hover > a:after{top:0; bottom:0;} #cssmenu > ul > li.has-sub > a:before{ content:''; position:absolute; top:18px; right:6px; border:5px solid transparent; border-top:5px solid #fff; } #cssmenu > ul > li.has-sub:hover > a:before{top:19px;} #cssmenu ul li.has-sub:hover > a{ background:#3f3f3f; border-color:#3f3f3f; padding-bottom:13px; padding-top:13px; top:-1px; z-index:999; } #cssmenu ul li.has-sub:hover > ul, #cssmenu ul li.has-sub:hover > div{display:block;} #cssmenu ul li.has-sub > a:hover{background:#3f3f3f; border-color:#3f3f3f;} #cssmenu ul li > ul, #cssmenu ul li > div{ display:none; width:auto; position:absolute; top:38px; padding:10px 0; background:#3f3f3f; border-radius:0 0 5px 5px; z-index:999; } #cssmenu ul li > ul{width:200px;} #cssmenu ul li > ul li{display:block; list-style:inside none; padding:0; margin:0; position:relative;} #cssmenu ul li > ul li a{ outline:none; display:block; position:relative; margin:0; padding:8px 20px; font:10pt "Lucida Grande", Helvetica, sans-serif; color:#fff; 
text-decoration:none; text-shadow:1px 1px 0 rgba(0,0,0, 0.5); } #cssmenu, #cssmenu > ul > li > ul > li a:hover{ background:#333333; background:-moz-linear-gradient(top, #333333 0%, #222222 100%); background:-webkit-gradient(linear, left top, left bottom, color-stop(0%,#333333), color-stop(100%,#222222)); background:-webkit-linear-gradient(top, #333333 0%,#222222 100%); background:-o-linear-gradient(top, #333333 0%,#222222 100%); background:-ms-linear-gradient(top, #333333 0%,#222222 100%); background:linear-gradient(top, #333333 0%,#222222 100%); filter:progid:DXImageTransform.Microsoft.gradient( startColorstr='#333333', endColorstr='#222222',GradientType=0 ); } #cssmenu{border-color:#000;} #cssmenu > ul > li > a{border-right:1px solid #000; color:#fff;} #cssmenu > ul > li > a:after{border-color:#444;} #cssmenu > ul > li > a:hover{background:#111;} #cssmenu > ul > li.active > a{ color:orange; } .header{ clear:both; } The problem is that whenever I hover over the dropdown menu, a 1px gap appears between the menu and the header. How can I solve that? I can't seem to find the solution.

    Read the article

  • Why deny access to website for msnbot/bingbot?

    - by Quandary
    I've seen quite a lot of tutorials that recommend banning user agents containing the strings libwww-perl and msnbot. I understand why one would ban libwww-perl - it's mainly, if not only, used for hacking and spamming. But why do so many sites recommend banning msnbot/bingbot? Since it's a search engine, even if only one with a marginal market share, I would expect one would want this bot to crawl one's sites. What is it that msnbot does that makes people ban it?

    Read the article

  • Best free software for hosting user guides

    - by Hippyjim
    Hi all, After having to clean up spam from a MediaWiki install for the umpteenth time, despite a reCAPTCHA plugin "preventing" automated signups, I'm wondering if MediaWiki is the right choice of CMS for hosting user manuals and guides. I've always loved the way wikis let guides be edited and commented on collaboratively, but I'm getting tired of dealing with automated vandals. I've disabled edits and signups for now, but as I'm having to go through the pain of cleaning thousands of junk pages, I'm beginning to think I should cut my losses and look for a better alternative. Does anyone know of a suitable FOSS application (preferably PHP/MySQL based) that would be simple for a non-coder (our manual writer) to edit, but that has all the interconnectivity and searchability of a wiki? Or should I just bite the bullet again and lock the wiki down even further?

    Read the article

  • Can I reuse my nameservers from one domain registrar with another?

    - by Nikki Erwin Ramirez
    My regular domain is one I got from GoDaddy. Just recently, I registered a short .cr domain (Costa Rica) at http://www.nic.cr/. During registration, they asked for nameservers (and just nameservers), so I thought of reusing my GoDaddy nameservers. I kind of assumed it would just be a straightforward mapping, but nothing's happening. What am I missing here? (There is an option to use their own nameservers, but I just wanted to explore this option first. If there's nothing to be had here, I'll fall back to using theirs.)

    Read the article

  • Is it fine to put different categories of content under a single domain name?

    - by Fahad Uddin
    I own a website about startups and finance. I am planning to take up WordPress programming and sell WordPress themes. I thought of buying a separate domain name for the WordPress site, but it takes quite a lot of time to set up a website and then do its SEO. Is it fine (in terms of SEO and professionalism) to put the WordPress category under my old domain instead? Domain: www.startupsandfinance.com WordPress subdomain: www.wp.startupsandfinance.com

    Read the article

  • Huge Google impression drop after cleaning HTML

    - by olgatorresfoundation
    Good morning, I am the webmaster of a non-profit organization that awards grants to colorectal cancer research projects and funds various colorectal cancer information campaigns. We have three domains: www.fundacioolgatorres dot org (Catalan), www.fundacionolgatorres dot org (Spanish) and www.olgatorresfoundation dot org (English). So what happened? I redesigned olgatorresfoundation on the 20th and fundacionolgatorres on the 30th of May. In both cases, exactly two days later, the number of impressions on both dropped to almost nothing. Granted, we did not have the traffic of Microsoft, but a 90% decrease is a disaster of incredible proportions for us. My only real change was cleaning up the old ineffective HTML into a cleaner form (mostly moving away from redundant table construction to a table-less layout). Here is a before and after snapshot of what the change looks like: Before: http://www.fundacioolgatorres.org/aparell_digestiu/introduccio/ (unchanged page in Catalan) After: http://www.olgatorresfoundation.org/digestive_system/introduction/ (changed page in English) Does anybody have a clue what just happened? Why should a normal, sane HTML improvement be punished, and so dramatically? No URLs have been changed, and neither have page names or descriptions. Possible secondary question: if Google sees it as a major overhaul and decides to drop the PageRank sharply, does it come back to pre-change levels once the content "checks out", or will the page start over from scratch earning those PageRank points (which would mean we would have to wait 6 months for the pages to recover to the level they had two weeks ago)? (duplicated from productforums.google dot com/forum/#!category-topic/webmasters/crawling-indexing--ranking/YsnyX0JzOpY, hoping to reach a wider audience)

    Read the article

  • Web application stopped behaving normally after migration from dedicated servers to EC2 servers

    - by sunny
    Our web application stopped behaving normally after migrating from dedicated servers to EC2 servers. Old dedicated server configuration - system type: 32-bit operating system; RAM: 3.99 GB; processor: 1.86 GHz. New EC2 servers - system type: 64-bit operating system; RAM: 3.99 GB; processor: 2.73 GHz - 2.31 GHz. Everything had been working fine on our production server, but once we migrated the web application to the new servers and transferred the entire network traffic over, the site suddenly started behaving abnormally. Sometimes it's super fast, sometimes slow, sometimes normal, sometimes super slow, and sometimes there is no response at all - all of this at intervals of around 2-3 minutes, and it went on for 8-10 hours. A few differences between the old and new servers: the old servers run IIS 6 and the new servers run IIS 7.5. We are using exactly the same code on both. The EC2 servers even have faster CPUs than the older servers, yet perform worse. I'm not sure how this happened. Please share your views...

    Read the article

  • CSS variable height columns [migrated]

    - by Rob
    I have created a website, www.unionfamilies.com. I have a header, a main section consisting of two columns, and a footer. Currently, I have specified heights for header, main, footer, etc. I would like to make the site so the header stays on top, the columns adjust to match the height of the longest column, and the footer stays at the bottom. Can someone help me with this? I am new to CSS, so please be patient. Thank you.

    Read the article

  • Blogger still visible after moving to WP; Google indexing issues

    - by Erin
    I recently migrated from Blogger to WordPress and am having two major transition issues that are really hurting. Despite literally hours of searching and experimenting, I cannot resolve the following: ISSUE ONE: I fixed all of my old Blogger links to 301 redirect successfully to my WP links (the two URL structures are different, and I realized too late), but my old Blogger blog is still sometimes visible! (The two designs are completely different.) I had 31 hits on my Blogger site just yesterday. I have updated my privacy settings to hide my Blogger blog from search engines and make it not visible on Blogger, and I have already removed my custom domain from Blogger as well. HELP! Not sure how to stop this. ISSUE TWO: Despite submitting a new sitemap and reindexing my pages for my WP blog, I am not visible in search engines, although I was very visible previously. In fact, some of my OLD links are showing up. Am I being penalized? Any thoughts on how to fix this? THANK YOU! Erin my site: www.thelawstudentswife.com

    Read the article

  • Weird .ASP pages from my non-ASP site generating 404s

    - by Amanda
    In Google Webmaster Tools, I have a huge list of "Not Found" (404) crawl errors with URLs that look like this: http://www.exclusivevillas.co.za/villa_view.asp?vSeq=82&activitySeq=3&page=3, seemingly originating from URLs very similar to them (e.g. http://www.exclusivevillas.co.za/villa_view.asp?vSeq=82&activitySeq=3&page=4). Thing is, the site is WordPress, and has been for almost a year now. It was plain HTML before that. I don't know where these ASP requests are coming from. Furthermore, the dates on which these supposed ASP pages requested the other ASP pages, resulting in the 404s, are very recent. What's going on?

    Read the article

  • Transfer websites and domains to new server

    - by Albert
    We currently have around 40 websites and 80+ domains/sub-domains in a shared 1&1 hosting package, and we just acquired a managed dedicated server, also with 1&1. Now it's time to start transferring everything over to the new server. Transferring just the websites and databases wouldn't be a problem; it would take time, but it's pretty straightforward. The problem comes when transferring the domains - let me explain why. Many of our websites are accessible via sub-domains of a parent domain. Ideally, we would like to transfer the sites one by one, in order to check for each one that everything works fine on the new server. However, since we also need to transfer the domain so it's managed on the new server, all the websites using that domain would need to be on the new server already before transferring that domain, which rules out the "one by one" approach. Another issue is the downtime when transferring the domain, between the moment it stops working in the hosting package and when it becomes active on the new server; I believe there's nothing we can do about that. So my question is whether there's any way we can do the "one by one" transfer of the websites (and their corresponding sub-domains) under the circumstances described above. One idea I had:
    1. Let's say we have website A, which is accessible via subdomain.mydomain.com (and there are many other websites accessible via other sub-domains of mydomain.com)
    2. Transfer the files of website A to the new server
    3. Point a test domain on the new server to website A's folder (the new server comes with a "test" domain)
    4. Test whether website A works with that "test" domain
    5. In the old hosting, somehow point the real sub-domain (subdomain.mydomain.com) to the new location of website A, in a way that users always see the same URL as before
    6. Repeat 2-5 for every website belonging to the same domain
    7. Once all are working on the new server, do the actual transfer of the domain to the new server, and then re-create all the sub-domains and point them to their corresponding websites
    That way, users wouldn't notice that anything has changed (except for a small downtime of the websites when doing the domain transfer). The part I'm not sure about is step 5 above. Is there any way to do that - in a way that users see the original domain all the time in their browser, even for internal pages (so not only for the home page, which would be sub-domain.mydomain.com, but also, for example, for the contact page, which would be sub-domain.mydomain.com/contact.php)? Is there any way to do this, or are we SOL and going to have to transfer everything at the same time?

    Read the article

  • Clean URLs issue using .htaccess in PHP project

    - by x4ph4r
    I am working on a PHP Laravel project and am currently facing issues with the .htaccess file. I have the following .htaccess: <IfModule mod_rewrite.c> Options -MultiViews RewriteEngine On RewriteBase / RewriteCond %{REQUEST_FILENAME} !-f RewriteRule ^ index.php [L] </IfModule> When I reload my page it gives me the following error: 404 Not Found - The requested URL /contacts was not found on this server. Then I opened the /etc/apache2/users/username.conf file, which had the following lines: <Directory "/Users/username/Sites/"> Options Indexes MultiViews AllowOverride None Order allow,deny Allow from all </Directory> In the above I changed AllowOverride None to AllowOverride All. Then I reloaded the page and got the following error: 403 Forbidden - You don't have permission to access /contacts on this server. When I add FollowSymLinks to the .htaccess Options, like this: Options -MultiViews FollowSymLinks, then sometimes I get a 500 Internal Server Error and sometimes *Error 324 (net::ERR_EMPTY_RESPONSE): The server closed the connection without sending any data*. Each time I reload the page I get one of these errors with the FollowSymLinks option. I also uncommented the following lines in /etc/apache2/httpd.conf: LoadModule rewrite_module libexec/apache2/mod_rewrite.so LoadModule php5_module libexec/apache2/libphp5.so and I am still getting the same permission denied error. Please help me - I have been trying to solve this problem for the past 3 days but it is still unresolved.

    Read the article

  • Referral traffic not appearing properly in Google Analytics

    - by Crashalot
    We have a partnership arrangement with another site where we pay them for users sent to us. However, they claim our referral numbers for them are lower than theirs by 50%. They are tracking clicks in Google Analytics (using events) while we are using visits in Google Analytics. Are we doing something wrong with our Google Analytics installation? <!-- Google Analytics BEGIN --> <script type="text/javascript"> var _gaq = _gaq || []; _gaq.push(['_setAccount', 'UA-12345678-1']); _gaq.push(['_setDomainName', 'example.com']); _gaq.push(['_trackPageview']); (function() { var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true; ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js'; var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s); })(); </script> <!-- Google Analytics END -->

    Read the article

  • Displaying topic/category-related ads

    - by Frank
    Is there any advertising company that allows webmasters to decide on general ad topics (such as "Entertainment", "Autos and vehicles", "Arts", etc.) to be displayed to a user visiting a website? I know you can choose specific ad campaigns to show to users, or let the advertising company decide for you which ads to show. But I am looking for an option to ask the advertising company to show ads based on categories/topics. Thanks. Frank

    Read the article

  • What I need to know if I want credit card payments on an ecommerce website

    - by Andriane
    I live in Costa Rica (Central America). I want to build an ecommerce website with credit card payments; I know ASP.NET and SQL Server 2008. I know PayPal and its Express Checkout solution, but many people (and clients) here don't like it or don't use it. PayPal and Authorize.Net don't support countries in Latin America, so could you tell me about a company that does, or what I can do to set up my shopping cart? I am currently studying security and how to implement SSL certificates, encrypt sensitive data, and achieve PCI compliance in some way. I need this for my own framework in ASP.NET, to provide ecommerce solutions here in my country.

    Read the article

  • Interpretation of empty User-agent

    - by Amit Agrawal
    How should I interpret an empty User-Agent? I have some custom analytics code, and that code has to analyze only human traffic. I have a working list of User-Agents denoting human traffic and one denoting bot traffic, but the empty User-Agent is proving to be problematic - and I am getting lots of traffic with an empty User-Agent, around 10%. Additionally, I crafted the human-versus-bot User-Agent lists by analyzing my current logs, so I might be missing a lot of entries. Is there a well-maintained list of user agents denoting bot traffic, or, inversely, a list of user agents denoting human traffic?
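
    A minimal sketch of that kind of classification, treating an empty User-Agent as bot traffic (a common heuristic, since real browsers always send one). The signature list is a tiny illustrative sample, not a maintained database, and record_hit() is a hypothetical stand-in for the custom analytics code:

        <?php
        // Tiny illustrative sample -- a real bot list would be much longer.
        $botSignatures = array('googlebot', 'bingbot', 'msnbot', 'libwww-perl', 'curl', 'wget', 'spider', 'crawler');

        function classify_user_agent($ua, $botSignatures) {
            // Browsers always send a User-Agent header; an empty one is
            // almost certainly a script, so count it as bot traffic.
            if (trim((string) $ua) === '') {
                return 'bot';
            }
            foreach ($botSignatures as $sig) {
                if (stripos($ua, $sig) !== false) {
                    return 'bot';
                }
            }
            return 'human';
        }

        $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
        if (classify_user_agent($ua, $botSignatures) === 'human') {
            // record_hit($_SERVER['REQUEST_URI']);  // hypothetical analytics call
        }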

    Read the article

  • cPanel's Web Disk - Security issues?

    - by Tim Sparks
    I'm thinking of using Web Disk (built into the later versions of cPanel) to allow a Windows or Mac computer to map a network drive that is actually a folder on our website (above the public_html folder). We currently use an antiquated local server to store information, but it is only accessible from one location, and we would like to be able to access it from other locations as well. I understand that folders above public_html are not accessible via HTTP, but I want to know how secure access to these folders is when they are mapped as a network drive. There is potentially sensitive information involved, and we need to decide whether it is appropriate to store it here. The map-network-drive option seems to work well, as it behaves as if the files were on your own computer (i.e. you can open and save files without then having to upload them, as that happens automatically). We have used Dropbox for similar purposes, but space is an issue with them, as is accountability, so we haven't used it for sensitive information. Are there any notable security concerns with using Web Disk as a secure file server?

    Read the article

  • Ruby 1.9.3 segmentation fault on Rackspace

    - by user531065
    I'm having a similar problem as "ruby 1.9 segmentation fault on exit." I followed the solution and updated libselinux with yum, and have the following version installed: Package libselinux-2.0.96-6.fc14.1.x86_64 already installed and latest version. I am running a RackSpace machine, kernel version: 2.6.35.4-rscloud. My C backtrace is:
    -- C level backtrace information -------------------------------------------
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x18910a) [0x7f3cdb6c010a] vm_dump.c:796
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x5eb04) [0x7f3cdb595b04] error.c:258
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(rb_bug+0xb8) [0x7f3cdb5969d8] error.c:277
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x119865) [0x7f3cdb650865] signal.c:609
    /lib64/libpthread.so.0() [0x328540eeb0]
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x179272) [0x7f3cdb6b0272] ./include/ruby/ruby.h:1306
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x17f31b) [0x7f3cdb6b631b] vm.c:1220
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x1800d5) [0x7f3cdb6b70d5] vm.c:624
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(rb_yield+0x44) [0x7f3cdb6bb274] vm.c:654
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0xab821) [0x7f3cdb5e2821] numeric.c:3204
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x183601) [0x7f3cdb6ba601] vm_insnhelper.c:404
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x1792b4) [0x7f3cdb6b02b4] insns.def:1015
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x17f31b) [0x7f3cdb6b631b] vm.c:1220
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x1800d5) [0x7f3cdb6b70d5] vm.c:624
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(rb_yield+0x44) [0x7f3cdb6bb274] vm.c:654
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(rb_ary_each+0x54) [0x7f3cdb5622e4] array.c:1478
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x183601) [0x7f3cdb6ba601] vm_insnhelper.c:404
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x1792b4) [0x7f3cdb6b02b4] insns.def:1015
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x17f31b) [0x7f3cdb6b631b] vm.c:1220
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(rb_iseq_eval+0x1f0) [0x7f3cdb6bbae0] vm.c:1447
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x67bfd) [0x7f3cdb59ebfd] load.c:310
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(rb_require_safe+0x6ef) [0x7f3cdb5a02df] load.c:619
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x183601) [0x7f3cdb6ba601] vm_insnhelper.c:404
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x1792b4) [0x7f3cdb6b02b4] insns.def:1015
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x17f31b) [0x7f3cdb6b631b] vm.c:1220
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x1800d5) [0x7f3cdb6b70d5] vm.c:624
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(rb_yield+0x44) [0x7f3cdb6bb274] vm.c:654
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(rb_ary_each+0x54) [0x7f3cdb5622e4] array.c:1478
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x183601) [0x7f3cdb6ba601] vm_insnhelper.c:404
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x1792b4) [0x7f3cdb6b02b4] insns.def:1015
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x17f31b) [0x7f3cdb6b631b] vm.c:1220
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(rb_iseq_eval_main+0xaf) [0x7f3cdb6bbbdf] vm.c:1461
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(+0x6471a) [0x7f3cdb59b71a] eval.c:204
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(ruby_exec_node+0x1d) [0x7f3cdb59c56d] eval.c:251
    /usr/local/rvm/rubies/ruby-1.9.3-p0/lib/libruby.so.1.9(ruby_run_node+0x1e) [0x7f3cdb59e60e] eval.c:244
    ruby() [0x4008eb]
    /lib64/libc.so.6(__libc_start_main+0xfd) [0x3284c1ee5d]
    ruby() [0x4007d9]

    Read the article

  • In Joomla, .htaccess REQUEST_URI is always returning index.php instead of the SEF URL

    - by Saumil
    I have installed JoomSEF 3.9.9 with Joomla 1.5.25. Now I want to force HTTPS for some sections of my site (e.g. URIs starting with /events/) while keeping the rest of the URLs on HTTP. I am setting rules in the .htaccess file but not getting the expected output. I am checking REQUEST_URI of the SEF URLs, but I always get index.php as the URI. Here is my .htaccess code: ########## Begin - Custom redirects # # If you need to redirect some pages, or set a canonical non-www to # www redirect (or vice versa), place that code here. Ensure those # redirects use the correct RewriteRule syntax and the [R=301,L] flags. # ########## End - Custom redirects # Uncomment following line if your webserver's URL # is not directly related to physical file paths. # Update Your Joomla! Directory (just / for root) # RewriteBase / ########## Begin - Joomla! core SEF Section # RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}] # # If the requested path and file is not /index.php and the request # has not already been internally rewritten to the index.php script RewriteCond %{REQUEST_URI} !^/index\.php # and the request is for root, or for an extensionless URL, or the # requested URL ends with one of the listed extensions RewriteCond %{REQUEST_URI} (/[^.]*|\.(php|html?|feed|pdf|raw))$ [NC] # and the requested path and file doesn't directly match a physical file RewriteCond %{REQUEST_FILENAME} !-f # and the requested path and file doesn't directly match a physical folder RewriteCond %{REQUEST_FILENAME} !-d # internally rewrite the request to the index.php script RewriteRule .* index.php [L] # ########## End - Joomla! core SEF Section Here is my rule; the site URL is, for example, http://mydomain.com/events: RewriteCond %{REQUEST_URI} ^/(events)$ RewriteCond %{HTTPS} !ON RewriteRule (.*) https://%{REQUEST_HOST}%{REQUEST_URI}/$1 [L,R=301] I don't understand why REQUEST_URI refers to index.php even though the URL in the address bar is http://mydomain.com/events. I am using JoomSEF (a Joomla extension for SEF URLs); if I remove the other rules from the .htaccess file, Joomla stops working. I can't find a way to handle this, as I am no expert. Please let me know if someone has been through the same situation and has a solution, or can suggest a workaround. Thanks

    Read the article

  • Advice for moving from custom domain hosted on blogspot to new custom domain on hosting provider [closed]

    - by Chan
    Need some advice :)
    a) Facts:
    1. Currently, we have a blog www.thebigbigsky.net hosted on blogspot.com. The site has been up for around a year, and Google has cached some of the posts.
    2. The idea for setting up the blog was to promote/share articles and write-ups about personal development and self-growth, and also to offer counseling services related to them.
    3. There is another subdomain - chinese.thebigbigsky.net - hosting similar blog posts but in Chinese. This is hosted on Blogspot as well.
    b) What we want to do:
    1. We did some research on the internet and understood that, to make a website more popular, it is better to have a domain name that is related to the content of the site. Hence, in our case, a domain name containing "personal development" would be more ideal and easier for people to remember.
    2. Hence, we are thinking of moving away from Blogspot to a meaningful new domain (example: personaldevelopmentwithxyz.com) and moving all content there. The new domain will be hosted with a hosting provider, e.g. hostgator.com.
    c) Here are the steps we have in mind:
    1. Register the new domain (example: personaldevelopmentwithxyz.com)
    2. Get a hosting provider, hostgator.com
    3. Set up a new site based on WordPress and import all content from the existing blog to this new site - personaldevelopmentwithxyz.com
    4. Switch the current blog back to thebigbigsky.blogspot.com
    5. Switch the DNS of www.thebigbigsky.net to point to the new site on hostgator.com
    6. On hostgator.com, set up a permanent 301 redirect from www.thebigbigsky.net to personaldevelopmentwithxyz.com by following the tutorial mentioned here - http://www.blogbloke.com/migrating-redirecting-blogger-wordpress-htaccess-apache-best-method/
    Questions:
    1. Will it have a major impact on the current Google PageRank or SEO of thebigbigsky.net? As the site is not that popular, we assume the impact would not be major.
    2. Is the domain personaldevelopmentwithxyz.com considered too long (for SEO etc.)?
    3. Steps c.4 and c.5 above - will this work for our case?
    4. How about the subdomain chinese.thebigbigsky.net? We would like to keep it as a separate blog with a domain like chinese.personaldevelopmentwithxyz.com.
    5. We added Facebook comments to the existing posts on www.thebigbigsky.net and would not want to lose them. Will the comments still be there after we move to the new domain?
    6. Any better ideas for doing it? :)
    Thank you very much! regards, -chan

    Read the article

  • Robots.txt practices with .htaccess redirections (inherits)

    - by Jayhal
    I have a question regarding how to write robots.txt files for many domains and subdomains with redirects in place. We have a hosting account with primary and add-on domains. All of our domains and subdomains, including the primary domain, are redirected via .htaccess 301s to their own subdirectories in the primary domain's root directory. I'm confused about how I would write the robots.txt for certain directories. First, I want to confirm that I'm right in understanding that, for domains and subdomains, crawlers will look in the directory that acts as that URL's root directory for the crawling rules (robots.txt). Also, that a directory will not be affected by a robots.txt present in its parent directory if the directory has its own domain/subdomain and that URL is the one being accessed by crawlers. (I'm pretty sure, but I wanted to confirm I didn't have a fundamentally flawed understanding of robots.txt.) In the original root directory on the account (where the primary domain was pointed before the .htaccess was put in place), what should the robots.txt contain? When crawlers come to crawl our primary domain, will they look in the original root directory for the robots.txt, or will they use the file contained in the new subdirectory where all the primary domain's site files are located? If so, what should the root's robots.txt include, if anything at all? Would I be right to include a simple 'Disallow: /' for all agents, and then include more specific robots.txt files in each subdirectory with more specific instructions? Would that affect the crawling of the directory where the primary domain is now redirected? Any help is greatly appreciated. Thanks!

    Read the article

  • Dynamic URLs and links on one web page

    - by John
    I am trying to figure out how to create dynamic links and URLs on a static webpage. What I want to do is the following: I have a single webpage, for example MYWEBPAGEdotCOM/INDEX.HTML, that will always look the same, except for one link on the page. That link would be, for example: LINK TO AFFILIATE: affiliatedotCOM/my-affiliate_code_here_DYNAMIC_REFERER. The only thing that would change is the "DYNAMIC_REFERER" part, driven by each dynamic URL for this page: MYWEBPAGEdotCOM/INDEX.PHP?id=test1 MYWEBPAGEdotCOM/INDEX.PHP?id=test2 MYWEBPAGEdotCOM/INDEX.PHP?id=test3 MYWEBPAGEdotCOM/INDEX.PHP?id=test4 which would change only the dynamic link on the page to: affiliatedotCOM/my-affiliate_code_here_test1 affiliatedotCOM/my-affiliate_code_here_test2 affiliatedotCOM/my-affiliate_code_here_test3 affiliatedotCOM/my-affiliate_code_here_test4 Can someone tell me how I could go about doing this? I don't want to have to make hundreds of pages, and doing this dynamically would prevent that.
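
    A minimal sketch of one way to do this, assuming the page is served as a PHP file and reads the id from the query string (the parameter name, the whitelist of codes, and the affiliate URL shape here are assumptions based on the examples above):

        <?php
        // Whitelist the allowed referer codes so arbitrary user input
        // never ends up in the outgoing affiliate link.
        $allowed = array('test1', 'test2', 'test3', 'test4');

        $id      = isset($_GET['id']) ? $_GET['id'] : '';
        $referer = in_array($id, $allowed, true) ? $id : 'default';

        // Static prefix plus the dynamic part taken from the URL.
        $affiliateUrl = 'http://affiliate.com/my-affiliate_code_here_' . $referer;
        ?>
        <html>
        <body>
            <!-- The rest of the page stays identical; only this link changes. -->
            <p>LINK TO AFFILIATE:
                <a href="<?php echo htmlspecialchars($affiliateUrl); ?>"><?php echo htmlspecialchars($affiliateUrl); ?></a>
            </p>
        </body>
        </html>

    Each incoming URL (index.php?id=test1, index.php?id=test2, ...) then renders the same static page with only the affiliate link changed, so no per-referer copies of the page are needed.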

    Read the article

  • Multiple site URLs now showing up in SERPs

    - by Scott Helme
    Recently, when you search for me on Google, several URLs from my blog have started showing up, instead of just the main URL: 1) https://scotthelme.co.uk/ 2) https://scotthelme.co.uk/category/wifi-pineapple-2/? 3) https://scotthelme.co.uk/wifi-pineapple-karma-sslstrip/ Is there a reason this has started happening? I have contemplated doing something to remove them, but is it better to occupy the top three spots instead of just the top spot? Should I just leave them there?

    Read the article
