Search Results

Search found 56530 results on 2262 pages for 'prasenjit niyogi@oracle com'.


  • 302 Redirect Issue for a Joomla 2.5.7 site

    - by DDD
    For my site I am using Joomla 2.5.7 with a Facebook comments tool for the articles on the site. I am getting a 302 redirect problem for the FB comments on the articles I post. I checked the URL here http://www.webconfs.com/http-header-check.php and got the following result, with a 302 redirect, for http://www.fijoo.com:
      HTTP/1.1 302 Moved Temporarily
      Date = Wed, 21 Nov 2012 09:46:39 GMT
      Server = Apache/2.2.22 (Unix) mod_ssl/2.2.22 OpenSSL/1.0.0-fips mod_auth_passthrough/2.1 mod_bwlimited/1.4 FrontPage/5.0.2.2635 mod_perl/2.0.6 Perl/v5.10.1
      X-Powered-By = PHP/5.3.16
      Set-Cookie = =en-GB; expires=Wed, 21-Nov-2012 10:46:40 GMT
      LOCATION = /
      Content-Length = 0
      Connection = close
      Content-Type = text/html
    How can I overcome this? Please help.
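    One way to see exactly which hop returns the 302, before changing anything in Joomla or the FB plugin, is to walk the redirect chain by hand. A minimal Python sketch; only the hostname comes from the question, everything else is illustrative:

      # Minimal sketch: print each hop's status and Location header
      # without following redirects automatically.
      import http.client
      from urllib.parse import urlsplit, urljoin

      def trace_redirects(url, max_hops=10):
          for _ in range(max_hops):
              parts = urlsplit(url)
              conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                          else http.client.HTTPConnection)
              conn = conn_cls(parts.netloc)
              conn.request("GET", parts.path or "/")
              resp = conn.getresponse()
              print(resp.status, resp.reason, "<-", url)
              location = resp.getheader("Location")
              conn.close()
              if resp.status not in (301, 302, 303, 307, 308) or not location:
                  break
              url = urljoin(url, location)

      trace_redirects("http://www.fijoo.com/")

    Comparing the chain with and without the FB comments plugin enabled should show whether the plugin or the server configuration is issuing the redirect.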

    Read the article

  • is it ok to have 2 sitemaps on 1 website?

    - by user615041
    Do I have to have a sitemap page linked from my index page for bots to read it, or can I just have it anywhere on my server? I have a phpBB/WordPress integration and I need two sitemap mods, one for each (or I need to have them somehow integrated into one XML sitemap). Is this possible? What's my best option? I would have the phpBB one at something like http://www.example.com/phpbb/sitemap.html and the WordPress one at something like http://www.example.com/wordpress/sitemap.html, and then I would submit both, but not put the links in my footer, to avoid confusing anyone; the sitemaps would strictly be for search engines. Is this a good idea? What are your thoughts?
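    For reference, the sitemap protocol also allows a single sitemap index file that points at several child sitemaps, so only one URL has to be submitted. A throwaway Python sketch that writes such an index; the two child URLs are the ones from the question, the output file name is illustrative, and this assumes both children are valid sitemaps:

      # Minimal sketch: generate a sitemap index referencing two child sitemaps.
      import xml.etree.ElementTree as ET

      SITEMAPS = [
          "http://www.example.com/phpbb/sitemap.html",
          "http://www.example.com/wordpress/sitemap.html",
      ]

      root = ET.Element("sitemapindex",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
      for url in SITEMAPS:
          entry = ET.SubElement(root, "sitemap")
          ET.SubElement(entry, "loc").text = url

      ET.ElementTree(root).write("sitemap_index.xml",
                                 encoding="utf-8", xml_declaration=True)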

    Read the article

  • TypeScript for Visual Studio 2012

    - by TATWORTH
    Originally posted on: http://geekswithblogs.net/TATWORTH/archive/2013/06/21/typescript-for-visual-studio-2012.aspx
    At http://www.microsoft.com/en-us/download/details.aspx?id=34790, Microsoft provide a free download of TypeScript for Visual Studio 2012. The documentation site is at http://www.typescriptlang.org/. It is described as follows: "TypeScript is a language for application-scale JavaScript development. TypeScript is a typed superset of JavaScript that compiles to plain JavaScript. Any browser. Any host. Any OS. Open Source. TypeScript starts from the syntax and semantics that millions of JavaScript developers know today. TypeScript compiles to clean, simple JavaScript code which runs on any browser, in Node.js, or in any other ES3-compatible environment. With TypeScript, you can use existing JavaScript code, incorporate popular JavaScript libraries, and be called from other JavaScript code. These features are available at development time for high-confidence application development, but are compiled into simple JavaScript." If you have written JavaScript, you will know why I welcome the release of version 0.9 of TypeScript, as TypeScript should be a lot less frustrating to write. I suggest you go to https://typescript.codeplex.com/ and follow this very promising project.

    Read the article

  • Xubuntu 12.04 upgraded fine, but pale grey windows/menus with white fonts are everywhere!

    - by Costa
    I upgraded to Xubuntu 12.04 through the update manager. Everything works fine except that many windows and menus (Ubuntu Software Center, Transmission properties, etc.) appear in pale grey with white fonts. For example, it's impossible to read and change properties in Transmission. Titles appear perfectly; the rest is really hard to read: white text on an almost-white background. I have already tried rebooting and changing the theme/appearance. Photo 1: http://en.zimagez.com/zimage/-26042012-102025.php When logged in as guest, everything works fine. I created a new account and all is fine. Photo 2: http://en.zimagez.com/zimage/-26042012-103258.php But I still want my initial user account, which I had tuned so well...

    Read the article

  • How can I create multiple mini-sites with similar/duplicate content without hurting my search engine rank?

    - by ekpyrotic
    Essential background: I run a small company that lets members of the public post handwritten letters to their local politician (UK-based). Every week a number of early stage bills (called Early Day Motions) are submitted for debate in the House of Commons, and supporters of the motion will contact their local Members of Parliament, asking them to sign the motion. The crux: I want to target these EDMs with customised mini-sites, so when people search "EDM xxx", they find my customised mini-site, specifically targeting that EDM (i.e., "Send a handwritten letter to your MP asking them to sign EDM xxx"). At the moment, all these mini-sites (and my homepage) have duplicate content with only the relevant EDM name, number, and background image changed. (For example, http://mailmymp.com and http://mailmymp.com/edm/teaching-life-saving-skills-at-school-edm-550.php). The question: Firstly, will this hurt my potential search engine ranking? And, if so, what's the best way to target these political campaigns in an efficient manner without hurting my SEO prospects?

    Read the article

  • Latest EMEA Partner Success Stories (21st November)

    - by swalker
    Be Recognised Through Partner Success Stories
    You can showcase your capabilities in Oracle products and industries through partner success stories that are published on Oracle.com, and benefit from the traffic on our portal. To participate, you are required to complete a form and inform us of a successfully implemented project. If your story is selected, we will contact you for an interview. Click here to access the form and submit your success story. The latest customer success snapshots with partners being uploaded to Oracle.com are: Robur S.p.A., Telenor, LinkPlus A.S., Urals Power Engineering Company, Bochemie Group, Mediaset S.p.A, Landbrokes, Coeclerici S.p.A., IDS GmbH - Analysis and Reporting Services, Teatre Nacional de Catalunya, LinkPlus A.S., Scottish Water, LH Dienstbekleidungs GmbH, Champion Europe SpA, Metropolitan Housing Partnership, McKesson, Robur. Learn more about our partner and customer successes by browsing the many EMEA Success Stories across all industries here.

    Read the article

  • Is it possible to track redirects to external sites from our subdomains?

    - by ChaBuku
    I have a handful of subdomains set up as redirects because we are using them for QR codes. I want to be able to track the QR code redirects (which are already set up and printed, so no changing them at this point) and see the effectiveness of each. Here are two examples: http://qr.glorkianwarrior.com and http://ad.glorkianwarrior.com are set up to forward to our iTunes page (later this year they may forward to Google Play or a specific landing page). Is there any way on my server to track the redirect from the subdomain to iTunes and see where traffic is coming from first? I presently have the redirects set up through cPanel using subdomains. Edit: From the research I've seen, I can't track a 301 directly. If I redirect to an internal page and then do a timed redirect to the iTunes link, how long will it take for the tracking script to record a hit?
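    One approach (not from the original post) is to serve each QR subdomain from a tiny application that records the hit itself and then issues the redirect, instead of a plain cPanel forwarding rule. A hedged Python/Flask sketch; the store URL is a placeholder:

      # Minimal sketch: log each visit to the QR subdomain, then redirect
      # the visitor to the external store page.
      import logging
      from flask import Flask, redirect, request

      app = Flask(__name__)
      logging.basicConfig(filename="qr_hits.log", level=logging.INFO)

      STORE_URL = "https://itunes.apple.com/app/idXXXXXXXXX"  # placeholder

      @app.route("/")
      def qr_redirect():
          # Record which subdomain was hit and where the visitor came from.
          logging.info("host=%s referrer=%s ua=%s",
                       request.host, request.referrer, request.user_agent)
          return redirect(STORE_URL, code=302)

      if __name__ == "__main__":
          app.run()

    The same effect can be had with a small server-side script in any language the host supports; the point is that a hop through your own code makes each hit easy to count, which a bare forwarding rule does not give you beyond the raw access logs.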

    Read the article

  • Avoid penalties for duplicate (multilanguage) shared hosting

    - by Dave
    My concern is about SEO. Let me explain the scenario. I am making a website in three languages. Development is going fine, but I was targeting local customers with one domain and international (English-version) visitors with another. E.g.: Local http://www.minhalojadesapatos.com.br (this is not the real website, just an example!) Other http://www.myshoesstore.com.br Both domains point to exactly the same hosting and content, but when a user comes through the local domain the default language is set to Portuguese; otherwise the default is English. Language handling on the backend uses PHP sessions and cookies, so users can change the content language with just a click. How do I avoid being SEO-penalised in this context? (Yes, choosing two domains to target the market was ambitious, but the business, a travel agency, really needs it.)

    Read the article

  • Does Windows 8 still support DirectX 9?

    - by SullY
    Is Windows 8 supporting DirectX 9? I was looking through some samples written in C++ and DirectX 9 made for Windows 8, and they weren't written the way I know it (look here: http://directxtutorial.com/Lesson.aspx?lessonid=111-4-2). E.g., initialising DirectX with COM smart pointers:
      ComPtr<ID3D11Device1> dev;
      ComPtr<ID3D11DeviceContext1> devcon;
    It's just weird, because I know it the old way:
      ID3D11Device *dev;
      ID3D11DeviceContext *devcon;
    (I hope you understand what I want to say.) I hope it hasn't changed completely since they released their new OS.

    Read the article

  • 301 redirect to 404 page?

    - by Kristian
    Currently I'm migrating the www. prefix away from my URLs and using .htaccess to do the job. Since we have new software and a cleaned database, some of the old URLs don't exist anymore, so some requests end up redirecting to a 404 page:
      1. www.domain.com/old-page  # .htaccess redirects to the non-www URL with a 301
      2. domain.com/old-page      # page does not exist, 404
    Does this method have any SEO issues, or does it even affect PageRank? Or should I check that the page exists before redirecting, and show the 404 without a redirect?
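    For reference, a minimal .htaccess sketch of the host-canonicalisation step being described (domain.com stands in for the real domain; whether missing pages should be exempted from the redirect is exactly the open question):

      # Hedged sketch: send every www request to the bare domain with a 301.
      # Pages that no longer exist will then return 404 at the canonical
      # (non-www) URL, which is the two-step behaviour described above.
      RewriteEngine On
      RewriteCond %{HTTP_HOST} ^www\.domain\.com$ [NC]
      RewriteRule ^(.*)$ http://domain.com/$1 [R=301,L]

    A RewriteCond %{REQUEST_FILENAME} -f condition could limit the redirect to paths that exist as real files, but that check does not see CMS-generated pages, so it is rarely a reliable way to decide between 301 and 404 here.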

    Read the article

  • Removing surrounding noises from voice recording

    - by Peak Reconstruction Wavelength
    I have a wave file whose frequency spectrum looks like this: http://i.stack.imgur.com/2rRaS.png It contains audio which I want to keep while removing the rest. The problem is that the surrounding noise changes; just those distinct voice patterns remain. I marked the voice patterns for clarity: http://i.stack.imgur.com/eLkBl.png What could an algorithm, or a workflow in Adobe Audition, look like that removes everything but the voice patterns? I think the main characteristic is the line-shaped form over time. Loudness alone is not enough, as the noise is loud as well.
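    A common starting point for this kind of separation is a spectral gate: estimate a noise profile from a region with no voice, then zero out time-frequency bins that do not rise sufficiently above it. A rough NumPy/SciPy sketch; the file name, the noise-only region and the threshold are assumptions, not taken from the question:

      # Rough spectral-gate sketch: keep only STFT bins that rise well
      # above a noise profile estimated from a voice-free region.
      import numpy as np
      from scipy.io import wavfile
      from scipy.signal import stft, istft

      rate, samples = wavfile.read("recording.wav")
      samples = samples.astype(np.float64)
      if samples.ndim > 1:                       # mix stereo down to mono
          samples = samples.mean(axis=1)

      nperseg = 1024
      f, t, Z = stft(samples, fs=rate, nperseg=nperseg)

      # Noise profile: mean magnitude per frequency bin over the first
      # second, assumed here to contain no voice.
      noise_frames = t < 1.0
      noise_profile = np.abs(Z[:, noise_frames]).mean(axis=1, keepdims=True)

      # Gate: keep bins at least 3x louder than the noise profile.
      mask = np.abs(Z) > 3.0 * noise_profile
      _, cleaned = istft(Z * mask, fs=rate, nperseg=nperseg)
      wavfile.write("cleaned.wav", rate, cleaned.astype(np.int16))

    Because the noise changes over time, the profile usually has to be re-estimated per section (or replaced by a running median across frames). In Audition the roughly equivalent workflow is capturing a noise print from a quiet passage and applying Noise Reduction, again per section.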

    Read the article

  • Problem with NVIDIA G86 on Kubuntu 12.04

    - by Stefan
    I got problems some weeks ago with my NVIDIA G86 (8500 GT), supposedly due to the infamous 295.40 version of the driver. I got error messages like NVRM: RmInitAdpterFailed. I tried various suggestions about setting kernel acpi and memory options, but no luck. I pulled in x-swat and got 302.17, if I remember correctly. It did not help. People recommended xorg-edgers, so I pulled that in and got kernel 3.5.0.12 and nvidia 304.43, but the problem remained. Getting slightly panicked, I tried to go back to vanilla 12.04, so I purged nvidia* and located and removed anything on the system that smelled of nvidia. I installed nouveau, because people said it was great, but as it turns out my card does not seem to be supported. :-( Sigh... So now I fear that I have a messed-up system, and graphics are terrible. Any help would be appreciated. Xorg.0.log: http://paste.ubuntu.com/1189616/ kern.log: http://paste.ubuntu.com/1189634/

    Read the article

  • Should package structure closely resemble class hierarchy?

    - by Panzercrisis
    Pretty simple question. Should package structure closely resemble class hierarchy? If so, how closely? Why or why not? For instance, let's say you've got class A and class B, plus class AFactory and class BFactory. You put class A and class B in the package com.something.elements, and you put AFactory and BFactory in com.something.elements.factories. AFactory and BFactory would be further down the hierarchy package-wise, but they'd be further up class-wise. Is this sort of thing a good idea or a bad idea?

    Read the article

  • problem showing my website correctly in search engines

    - by dinbrca
    Hello guys, I have a website which I got indexed on Google about 15 days ago. Some of my pages pass arguments like this: http://www.bla.com/products.php?pro=bla&page=view Suddenly I saw that passing arguments like this isn't good for SEO purposes, so I started using .htaccess rewrites and changed the URLs to something like this: http://www.bla.com/products/bla/*view*/ Now my site on Google still shows up as in link number 1. What should I do? I thought I should wait for the search engine to crawl my site again, but nothing has happened. Thanks in advance, Din
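    The usual fix is to 301-redirect the old query-string URLs to their new counterparts, so the index is updated on the next crawl instead of waiting for the old entries to fade. A hedged .htaccess sketch based only on the example URLs above (the target path is a guess at the intended clean URL, and the pattern must be adapted to the real rewrite rules in use):

      # Hedged sketch: permanently redirect products.php?pro=...&page=view
      # to the new clean URL; the trailing "?" drops the old query string.
      RewriteEngine On
      RewriteCond %{QUERY_STRING} ^pro=([^&]+)&page=view$ [NC]
      RewriteRule ^products\.php$ /products/%1/view/? [R=301,L]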

    Read the article

  • How to enable hibernate on Ubuntu 13.10?

    - by mjwittering
    Something I usually do after installing Ubuntu is reactivate the hibernation function, which I find quite useful for the more energy-conscious. Typically I'd complete the instructions in this tutorial for Ubuntu 12.04 and I'd be back in business. However, it doesn't seem to work any more. Any suggestions? http://www.howtogeek.com/113923/how-to-re-enable-hibernate-in-ubuntu-12.04/
      sudo gedit /etc/polkit-1/localauthority/50-local.d/com.ubuntu.enable-hibernate.pkla

      [Enable Hibernate]
      Identity=unix-user:*
      Action=org.freedesktop.upower.hibernate
      ResultActive=yes
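    For what it's worth, on 13.10 power actions are increasingly handled by logind rather than upower, so the 12.04 recipe above often needs an additional section covering the logind action as well. A hedged sketch of the extra stanza in the same .pkla file (the action name assumes the systemd-logind polkit policy):

      [Enable Hibernate via logind]
      Identity=unix-user:*
      Action=org.freedesktop.login1.hibernate
      ResultActive=yes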

    Read the article

  • Using Mod_Rewrite To Block Referrer Based On Domain Extenstion?

    - by Matt
    I've been in web development for several years now (I'm a student web designer), and recently I've begun to experiment with mod_rewrite for things like URL shortening. I was wondering: is it possible to block a referrer by domain extension, instead of just by full site, etc.? So, instead of
      RewriteEngine on
      # Options +FollowSymlinks
      RewriteCond %{HTTP_REFERER} examplesite\.com [NC]
      RewriteRule .* - [F]
    could you do
      RewriteEngine on
      # Options +FollowSymlinks
      RewriteCond %{HTTP_REFERER} \.com [NC]
      RewriteRule .* - [F]
    without the full domain name? Thanks. I'm fairly knowledgeable about other web dev / hosting topics, but mod_rewrite is new to me and Google wasn't helping.

    Read the article

  • SharePoint Search Problem: The start address sps3://server cannot be crawled.

    - by Clara Oscura
    With this post, I'm going to start a series on problems I have encountered with SharePoint search.
      Error: The start address sps3://luapp105 cannot be crawled.
      Context: Application 'Search_Service_Application', Catalog 'Portal_Content'
      Details: Access is denied. Verify that either the Default Content Access Account has access to this repository, or add a crawl rule to crawl this repository. If the repository being crawled is a SharePoint repository, verify that the account you are using has "Full Read" permissions on the SharePoint Web Application being crawled. (0x80041205) (Event ID: 14, Task Category: Gatherer)
    Solution: give appropriate permissions to the User Profile Synchronisation Service.
      http://social.technet.microsoft.com/Forums/en-US/sharepoint2010setup/thread/64cdf879-f01e-4595-bc52-15975fefd18d
      http://www.dotnetmafia.com/blogs/dotnettipoftheday/archive/2010/03/29/how-to-set-up-people-search-in-sharepoint-2010.aspx

    Read the article

  • How do I install OpenStack?

    - by csgeek
    Supposedly OpenStack can be installed easily under Ubuntu 12.04 LTS. I've installed 32- and 64-bit versions of Ubuntu Server with the same behaviour: I run sudo tasksel, check OpenStack, hit OK, and then I get tasksel: aptitude failed (100). I've seen the http://www.hastexo.com/resources/docs/installing-openstack-essex-20121-ubuntu-1204-precise-pangolin and https://github.com/EmilienM/doc-openstack documentation, but I was hoping that since this is an LTS release and it is an option in tasksel, I was simply overlooking something obvious and it's just a matter of selecting the right checkbox and hitting OK. Too much wishful thinking?

    Read the article

  • What measures can be taken to make sure Google is aware of the existence of a newly created page?

    - by knorv
    Consider a website with a large number of pages, where new pages are published regularly. When publishing a new page, the website operator wants the newly created page indexed in Google as soon as possible, minimizing the time between publication and indexing. Consider the site http://www.example.com/ with hundreds of thousands of pages. The page http://www.example.com/something/important-page.html is created at, say, 12:00. I want to get important-page.html indexed as soon as possible after 12:00, ideally within seconds or minutes. What options are available to try to get Google to index a specific newly created page as soon as possible?
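    One common tactic at publish time is to add the new URL to the XML sitemap immediately and then ping Google so the sitemap is re-fetched promptly. A minimal Python sketch, assuming the sitemap ping endpoint Google documented for Webmaster Tools at the time, with an illustrative sitemap URL:

      # Minimal sketch: after publishing, tell Google the sitemap (which
      # now lists the new page) has changed.
      import urllib.parse
      import urllib.request

      def ping_google(sitemap_url):
          ping = ("http://www.google.com/ping?sitemap="
                  + urllib.parse.quote_plus(sitemap_url))
          with urllib.request.urlopen(ping) as resp:
              return resp.status  # 200 means the ping was accepted

      print(ping_google("http://www.example.com/sitemap.xml"))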

    Read the article

  • Farseer Physics: Ways to create a Body?

    - by EdgarT
    I want to create something similar to this using Farseer and Kinect: https://vimeo.com/33500649 This is my implementation so far: http://www.youtube.com/watch?v=GlIvJRhco4U I have the outline vertices and the triangulation of the user. Following the Texture to Polygon sample, I used this line to create the shape, where farseerObject is a list of the triangles' vertices: _compound = BodyFactory.CreateCompoundPolygon(World, farseerObject, 1f, BodyType.Dynamic); But I have to update the body each frame (around 30 fps) and this is very slow; I get just 2 or 3 fps. Is there another (faster) way to create the body from a list of triangles or from the contour vertices?

    Read the article

  • How to test 3d acceleration?

    - by HappyDeveloper
    I want to install and test 3D acceleration in Ubuntu 12. I have read these pages: https://help.ubuntu.com/community/RadeonDriver https://wiki.ubuntu.com/X/Troubleshooting/VideoDriverDetection?action=show&redirect=X%2FTroubleshooting%2FFglrxInteferesWithRadeonDriver#Problem:_Need_to_purge_-fglrx I think I have installed it correctly, but I don't know how to test it. I tried to play Minecraft in the browser, but I got a black screen. It may be a Java problem too, so I need to troubleshoot. So how can I test my video card and drivers?
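    A quick way to check whether direct rendering is active, independently of any game or browser plugin, is the mesa-utils tools. A hedged sketch of the usual terminal checks (package and command names as commonly shipped on Ubuntu):

      sudo apt-get install mesa-utils      # provides glxinfo and glxgears
      glxinfo | grep "direct rendering"    # expect: direct rendering: Yes
      glxinfo | grep "OpenGL renderer"     # shows which driver/GPU is in use
      glxgears                             # smooth, fast gears = acceleration working

    If these checks pass but Minecraft still shows a black screen, the problem is more likely the browser's Java plugin than the video driver.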

    Read the article

  • Meta tags error 500 on Facebook wordpress [migrated]

    - by La Clandestina
    Let's see: I changed the theme in the last few days and hadn't published anything till now, but now I get a 500 error in the Facebook debugger: https://developers.facebook.com/tools/debug/og/object?q=http%3A%2F%2Fquitoxic.com%2Funpocodesur%2Fmachupicchu-for-cheap%2F So Facebook sharing is not pulling anything, not even the title. I tried uninstalling every plugin that could have problems with the meta tags, but nothing worked. I reinstalled one of them and it's still not working. When I look at the code everything looks fine, but I'm not an expert. Can anyone tell me what is wrong and how I can fix it?
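    A useful first step is to reproduce the request the scraper makes, since some servers and security plugins respond differently to Facebook's user agent than to a normal browser. A hedged check from a terminal (the URL is the one embedded in the debugger link above):

      # Fetch the page the way Facebook's scraper does and look at the
      # status line and headers for the 500.
      curl -I -A "facebookexternalhit/1.1" \
        "http://quitoxic.com/unpocodesur/machupicchu-for-cheap/"

    If this returns 500 while a plain curl -I of the same URL returns 200, a plugin or server rule is blocking the scraper; if both return 500, the error is in the page itself and the web server's error log should name the failing PHP file.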

    Read the article

  • Tales from the Coal Face - Reporting errors

    - by TATWORTH
    One of the questions that comes up frequently is, "Is it worthwhile to report errors?" Last weekend, after installing the latest StyleCop, I loaded up my copy of Power Collections. I found that StyleCop was now correctly picking up a lot of missing "this." statements; however, there were now a number of false positives. Anticipating the need to submit sample code, I cleaned the solution and zipped it up. I reported this at http://stylecop.codeplex.com/discussions/357319. The StyleCop administrator promoted this report to a work item (see http://stylecop.codeplex.com/workitem/7285) and I uploaded the previously prepared ZIP file. The StyleCop team was able to locate the problem, and it is "Fixed in upcoming 4.7.27". The conclusion: report errors! Prepare sample code illustrating the error.

    Read the article

  • No sound in any web browser(s)

    - by shaneo
    Hello, I recently tried to compile and update ALSA from source via this guide: http://www.stchman.com/alsa_update.html. Afterwards there is no sound on any page in any web browser I open (Firefox, Opera, Chrome, Chromium, Iron). I went back through the script listed on the site, found where it had installed the drivers, deleted them, and then reinstalled ALSA via Synaptic. However, I still have no sound in my browsers. All system sounds work as they are supposed to; only web sounds don't work. Here is my alsa-base.conf: http://paste.ubuntu.com/1073135/ and also a snapshot of alsamixer. Any assistance would be greatly appreciated. Thank you, and let me know if any more information is required.

    Read the article

  • Programming Entity Framework, 2nd Edition (EF4) Table of Contents

    We are closing in on finalizing the 2nd edition of Programming Entity Framework! Although the rough-draft chapters are already available through Safari's Rough Cuts program (here: http://oreilly.com/catalog/9780596807252), I have been editing and reshaping the content since those chapters were published. You can get the final print edition (August 15th, or perhaps a bit earlier) at O'Reilly or pre-order it on Amazon.com (and elsewhere, of course!). I believe that the book will...

    Read the article
