Search Results

Search found 21197 results on 848 pages for 'webcenter content'.


  • How can I read the source code of a PDF?

    - by Fendrix
    I want to get the source code of a PDF file. Unfortunately, once I open the PDF with a text editor, some lines don't make sense, just like: %PDF-1.6 %âãÏÓ 3 0 obj <</Ff 0/F 6/Type/Annot/Subtype/Widget/DR<</Font<</Helv 2 0 R>>>>/T(Ä\n¬4^XÈ4ýæçO§W²W^D³^Ywzº<92>õÌ^AÀÄi]â<96><8c>)/V(Ä\n¬4^XÈ4ýæçO§W²W^Dø<93>r^D¥à<82>ú<83>Z^Q7^Cv^FÈ)/AP<</N 1 0 R>>/P 4 0 R/BS<</W 1/S/S>>/FT/Tx/Rect[40 50 70 80]/DA(Ä\n¬4^XÈ4ýæçO§W²W^Dù~êw3<84>&^X´âL|q@³^VC<8a>"Ýo^N¿=Ì<91>ta^R`àz)>> endobj 6 0 obj So %PDF-1.6 is fine, but after that it's not resolving to the correct letters. I tried with vim. Is there any chance to get the correct content?
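    Worth knowing: a PDF mixes readable structure with binary data - stream contents are usually compressed (FlateDecode), and in encrypted files the string values are scrambled as well - so a text editor can never show all of it as plain text. A minimal sketch of one way to get a readable version, assuming the qpdf command-line tool is installed and a placeholder file name of form.pdf:

      import subprocess

      # Rewrite the file in qpdf's "QDF" form: streams are uncompressed and
      # objects are laid out one per line, so the result is readable in vim.
      subprocess.run(
          ["qpdf", "--qdf", "--object-streams=disable",
           "form.pdf", "form-readable.pdf"],
          check=True,
      )

      # Peek at the start of the rewritten file.
      with open("form-readable.pdf", "rb") as readable:
          print(readable.read(2048).decode("latin-1"))

    If the file is encrypted (the garbled /T and /V string values above suggest it may be), qpdf would also need the password via its --password option; without it the string values stay unreadable.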

    Read the article

  • Is it possible to filter analytics to particular visits like you can filter to particular dates?

    - by andy
    Is it possible to find out more information about particular visits in Analytics? For example, say I'm looking at new versus returning users. I then add a secondary column of "city". OK, now I know that all new users from yesterday came from New York, for example. But what if I want to find out more information about those particular new visits from New York, such as behaviors, technology, and content? Is it possible to filter Analytics to particular visits the way you can filter to particular dates?

    Read the article

  • Is there any way to discover the traffic of a site I don't control?

    - by George Bailey
    Given the following: The website does not call any external images or scripts; all the content is hosted on a server that is in our control. The website does not contain the meta tag, nor the HTML file, that would grant a Google account access to Webmaster Tools. The access logs have not been provided to any 2nd or 3rd party. Is it possible for a 3rd party to get an idea of how many hits the site is getting, or are they limited to just seeing how high the site ranks? How could the 3rd party determine how well the site is doing under these restrictions? Is there a website for that, that you know of?

    Read the article

  • Joomla 2.5 -- Adding a custom field to menu-item-edit-form

    - by philipp
    I would like to add a new field (a select list of all menu items) to the menu item edit form. To do so I was setting up a system plugin with the following directory structure: languageroot/languageroot.php languageroot/form/form.xml As you can see in the posted code, that is all very basic, just to try things out. Only after adding the following lines: <li><?php echo $this->form->getLabel('langroot-text', 'main'); ?> <?php echo $this->form->getInput('langroot-text', 'main'); ?></li> to /administrator/components/com_menus/views/item/tmpl/edit.php does a text field show up. Is it possible to inject the field without touching edit.php? Is there anywhere a good tutorial about the JForm API? Is a system plugin the right kind, or could it be a content plugin, or should it even be a component?

    Read the article

  • How would I pursue a track in front-end web development?

    - by Koviko
    I've recently been put on heavy JavaScript projects and have become fond of the front-end world in comparison to the back-end. I have always been good at proper markup and CSS, and coupled with AJAX, pretty animations, and dynamically generated content, it's become a much more interesting and flashy world for me. I would like to be able to continue to hone my craft in the same way that I was able to become proficient at back-end development with PHP: getting paid to do it. How would I market myself as a front-end web developer with a strong interest in dynamic JavaScript-driven websites? Due to my strong background in back-end development, how would I find the companies that wouldn't waste my front-end skill set on simple HTML/CSS development? And as a bonus, how would I apply this to being a contractor/freelance developer rather than a salaried employee? While I like the idea of being able to remain a part of my creations, I also dislike the maintenance phase of projects.

    Read the article

  • DBA Best Practices: A Blog Series

    - by Argenis
    Introduction: After the success of the “Demystifying DBA Best Practices” Pre-Conference that my good friend Robert Davis, a.k.a. SQLSoldier [Blog|Twitter], and I delivered at multiple events, including the PASS Summit 2012, I have decided to blog about some of the topics discussed at the Pre-Con. My thanks go to Robert for agreeing to share this content with the larger SQL Server community. This will be a rather lengthy blog series - and, as in the Pre-Con, I expect a lot of interaction and feedback. Make sure you throw in your two cents in the comments section of every blog post. The first topic that I’ll be discussing in this blog series is the thing of utmost importance for any Database Administrator: the data. Let’s discuss the importance of backups and a solid restore strategy. Care to share your thoughts on this subject in the comments section below?

    Read the article

  • Is it safe to Block These URLs with Robots.txt?

    - by Edgar Quintero
    I have a website that has all URLs optimized and 301 redirected from nasty URLs to clean ones. However, everywhere throughout the site the unclean URLs are still linked in menus, content, products, etc. Google currently has all the clean URLs indexed, along with a few unclean URLs too. So the old URLs are still linked all over the site (ideally this wouldn't be the case, but this is how it is at the moment). I would like to block the unclean URLs with robots.txt. The question: if I block these unclean URLs with robots.txt while the entire website still links to them (but they all redirect to the clean version), will this affect the indexing status at all?
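    One way to sanity-check a rule set like this before deploying it is to run the candidate robots.txt against a sample of old and new URLs with Python's standard-library parser. A minimal sketch - the Disallow patterns and URLs are placeholders, not taken from the question:

      from urllib.robotparser import RobotFileParser

      # Candidate rules for the "unclean" URLs (hypothetical patterns)
      rules = """
      User-agent: *
      Disallow: /old/
      Disallow: /catalog.php
      """

      parser = RobotFileParser()
      parser.parse(rules.strip().splitlines())

      for url in ("https://example.com/old/widget-123",
                  "https://example.com/widget-123"):
          verdict = "blocked" if not parser.can_fetch("*", url) else "allowed"
          print(url, "->", verdict)

    Also worth keeping in mind: a crawler never requests a URL that robots.txt disallows, so it will not see the 301 on that URL either.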

    Read the article

  • SPARC Servers at Oracle OpenWorld

    - by B.Koch
    There is plenty to learn about the SPARC servers at Oracle OpenWorld. The SPARC server sessions offer depth and breadth of content to satisfy everyone's needs, from the technically oriented attendee to the one who would like to understand more about the business value of SPARC technology. And there is always more: keynotes, Oracle innovations, and many product demonstrations are only a few of the many other opportunities to interact with product experts and executives and gain greater insight into Oracle SPARC technology. The Oracle SPARC Servers: Oracle's SPARC servers running Oracle Solaris are ideal for mission-critical applications that require high performance, best-in-class availability, and unmatched scalability on all application tiers. With a robust roadmap, Oracle assures the highest levels of investment protection through 100% SPARC/Solaris binary compatibility, proven by hundreds of thousands of deployments over more than 20 years.

    Read the article

  • Programmatically loading user controls

    - by PhilSando
    Today's little problem is that I am trying to load user controls from my codebehind like so: Dim myControl As UserControl = Page.LoadControl("~\Modules\Content.ascx") Controls.Add(myControl) On running the page, myControl is nowhere to be seen. I wonder why that is? Well, after a bit of thought the following questions come to mind... Am I using the correct code to insert the user control? Is there an alternative available? Does the fact that the user control has a Page_Load method make a difference? Does the fact that the user control is being called from the Page_Init method make a difference? Do I need to register the control in my aspx page at design time? I'll be looking to answer these questions as the day goes on!

    Read the article

  • Ensure Future Browser Compatibility - SharePoint Branding

    - by KunaalKapoor
    HTML and Future Internet Explorer Compatibility with SharePoint: As new versions of Internet Explorer are released, the way HTML is rendered by the browser could change over time. To address the possibility of changes, Microsoft uses the X-UA-Compatible META tag to target HTML markup at a specific version of Internet Explorer. The default SharePoint 2010 master pages are set to force current and future versions of Internet Explorer to render HTML in Internet Explorer 8 mode with the following markup: <meta http-equiv="X-UA-Compatible" content="IE=8" /> The Adventure Works Travel HTML includes the META tag to help ensure future Internet Explorer versions will display the SharePoint HTML properly. For more information about Internet Explorer Standards Mode, see META Tags and Locking in Future Compatibility.

    Read the article

  • How does ad retargeting work?

    - by Bojan Babic
    Recently I read that Facebook ads are moving towards retargeting and got more deeply interested in the subject. Essentially, retargeting is a technique advertisers use to track purchase intent by putting cookies into your browser, so that when you visit another website within the ad network you will see an ad for the item you "wanted to buy". In order for this to work, both the publisher and the advertiser need to work together. The publisher needs to add a couple of lines of JavaScript, and the ad network needs to be able to read this info across sites. In most cases, the JavaScript inserts an iframe from the ad network's domain. The iframe script sets cookies for both the host domain and the remote ad network domain. However, the same-origin policy does not let iframes read or set content on the parent domain. Is there something I'm missing here? Can someone explain how this technique actually works?
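    The detail that usually resolves this confusion is that the iframe (or tracking pixel) does not need to touch the parent page's cookies at all: it is served from the ad network's own domain, and the cookie it sets belongs to that domain. Every publisher page that embeds the same iframe makes the browser send that cookie back to the network. A minimal sketch of such a pixel endpoint, assuming a Python/Flask service and made-up names (adnetwork.example, a "viewed" cookie, a "product" parameter):

      from flask import Flask, make_response, request

      app = Flask(__name__)  # imagined service running on adnetwork.example

      @app.route("/pixel")
      def pixel():
          # An advertiser page embeds something like:
          #   <iframe src="https://adnetwork.example/pixel?product=sku123"></iframe>
          # The cookie below is set for adnetwork.example, the iframe's own
          # origin - the parent page's cookies are never read or written.
          seen = request.cookies.get("viewed", "")
          product = request.args.get("product", "")
          response = make_response("", 204)  # stand-in for a 1x1 image
          response.set_cookie(
              "viewed",
              (seen + "," + product).strip(","),
              max_age=30 * 24 * 3600,  # remember intent for ~30 days
              secure=True,
              samesite="None",         # needed for third-party contexts today
          )
          return response

    When a publisher page in the same network later embeds an iframe or ad call from adnetwork.example, the browser sends the "viewed" cookie along, and the network can pick an ad for the remembered product - no same-origin exception required.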

    Read the article

  • Extracting meta tag attributes using wget [migrated]

    - by Amit
    I have a file with some URLs, one per line. I need to extract the "keywords" present in the meta tags, i.e. if there is a meta tag for "keywords" then I want to get its "content" value. Example: if the web page has the meta tag <meta name="keywords" content="wikipedia,encyclopedia">, then for that URL I want "wikipedia,encyclopedia" to be extracted. One approach is to download the web page using wget and then parse it with some standard HTML parser. I was wondering, is there any better way to do this without downloading the entire web page?
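    One option, sketched below in Python rather than wget (an assumption - the question only requires that the whole page not be downloaded), is to stream each response and stop reading as soon as </head> has arrived, since the keywords tag normally lives in the head. The regex is a simplification that expects name before content:

      import re
      import requests  # third-party: pip install requests

      KEYWORDS = re.compile(
          r'<meta\s+name=["\']keywords["\']\s+content=["\']([^"\']*)["\']',
          re.IGNORECASE)

      def keywords_for(url):
          head = ""
          with requests.get(url, stream=True, timeout=10) as resp:
              for chunk in resp.iter_content(chunk_size=1024, decode_unicode=True):
                  head += chunk
                  if "</head>" in head.lower():
                      break  # stop before the body is transferred
          match = KEYWORDS.search(head)
          return match.group(1) if match else None

      with open("urls.txt") as url_file:  # the file of URLs, one per line
          for line in url_file:
              print(line.strip(), keywords_for(line.strip()))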

    Read the article

  • Is there a recommended order to take the Oracle Java EE certification exams?

    - by Karl
    I recently passed the Oracle Certified Professional, Java SE 6 Programmer examination. Now, my boss would like me to take "the next step" to broaden my competence. I tried to explain that there is no equivalent Java EE 6 Programmer examination, but a number of different exams, such as Web Services, Web Components, and Enterprise JavaBeans. Is there a recommended path to follow for the various Oracle certifications in the Enterprise Edition of Java? Is it logical to take some exams prior to taking others because the content builds upon previous knowledge or are they all independent?

    Read the article

  • Is there any good reason I would want my website to be framed?

    - by minitech
    I'm building a website that's not security-critical in any way at all, so having somebody put a page in an <iframe> is not particularly dangerous to its users. However, as my website doesn't have script plugins that will be used anywhere else, is there any reason why I shouldn't just apply: X-Frame-Options: Deny to every page on my website? Is there any valid reason for any other website to embed mine? I've seen plenty of content-stealing ones and attempts to hijack user accounts, but never an actual good usage of frames that's not an explicit feature of the website.
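    For what it's worth, sending the header site-wide is a one-liner in most stacks. A minimal sketch assuming a Python/Flask app (the question doesn't name a stack, so this is purely illustrative):

      from flask import Flask

      app = Flask(__name__)

      @app.after_request
      def deny_framing(response):
          # Refuse to be rendered inside any frame or iframe on any site
          response.headers["X-Frame-Options"] = "DENY"
          return response

      @app.route("/")
      def index():
          return "This page cannot be framed."

    The more modern equivalent is the Content-Security-Policy frame-ancestors directive, which can also allow specific framing origins if a legitimate use ever turns up.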

    Read the article

  • Google Webmaster Tools Index dropped to Zero [closed]

    - by Brian Anderson
    Earlier this year I rebuilt my website using ZenCart. Immediately I saw a drop in index status from 59 to 0. I then signed up for Google Webmaster Tools and noticed the Index status took a dramatic drop and has never recovered. I have worked to add content and I know I am not done, but have not seen any recovery of this index since. What confuses me is when I look at the sitemap status under Optimization it shows me there are 1239 submitted and 1127 pages indexed. Most of my pages have fallen off page one for relevant search terms and some are as far back as page 7 or 8 where they used to be on the first page. I have made some changes in the past week to robots.txt and sitemap.xml, but have not seen any improvements. Can anyone tell me what might be going on here? My website is andersonpens.net. Thanks! Brian

    Read the article

  • Last chance to see ... Virtualisation for Developers at NxtGenUG Cambridge, Tuesday 14th December

    - by Liam Westley
    As a farewell to 2010 I'm also saying farewell to presenting my Virtualisation for Developers and Hyper-V for Developers presentations, with a final outing at NxtGenUG in Cambridge (my first visit to a user group in The Fens). I may have some homemade nibbles and party stuff to liven up the evening, and a certain Rachel Hawley has suggested a santa hat might be appropriate too. It's going to be a fun night. Sign-up details are available here: http://www.nxtgenug.net/ViewEvent.aspx?EventID=353 And for those of you who can't make this last outing, I am planning on converting both presentations into a series of blog posts so the content will be available to a wider audience. If the posts don't seem to be appearing fast enough, drop me an e-mail to remind me to get on with it!

    Read the article

  • What is the good side of PageRank?

    - by SharkTheDark
    I am doing research about backlinks/PR/SEO/search result position, and all I read is that PageRank is not important - that it was worth something before, but now it's not important at all. The only useful thing I found about it is that it can change search result position, but ONLY if there are two sites with the same keywords and the same text content value; then the search engine will check which site has the higher PR and place that site above the lower one. Google counts PR importance as 20% for displaying search rankings, and Yahoo! is like 3%... Correct me if I am wrong... Is there any other good thing about it?

    Read the article

  • Why does Google ignore my links page?

    - by Yaniv
    I have a website where I'm loading all the data via AJAX. Since Google doesn't work with AJAX, and the ways to make a site AJAX-friendly are a bit odd, I thought that creating a links page - one that links, from the server side, to all the links that I'm loading in AJAX - would solve the problem. But unfortunately, that doesn't seem to work. Google Webmaster Tools shows that even though my links page was discovered, its content - the links - is totally ignored. I can only assume that Google tends to ignore links on such pages. My question is: WHY?! And furthermore, how do I overcome this? Thanks.

    Read the article

  • How can I access profile fields with a % variable in Drupal Actions?

    - by Rob Mosher
    I have an action set up in Drupal to e-mail me when a new user registers for the site. Right now it is only telling me their username (%username). Is there a variable that can access added fields so I can get their real name (First Last), or another way to add this info to the action message? So instead of my new-user action having a message like "%username created an account" - "jschmoe created an account" - I could have "%first_name %last_name (%username) created an account" - "Joe Schmoe (jschmoe) created an account". I'm using the Content Profile module for the first and last name fields, though I have few enough users at the moment that I could switch to Profile module fields.

    Read the article

  • Why is Ubuntu not booting from the live USB anymore?

    - by xRobot
    I have just purchased a brand new Samsung 300e5c laptop with Windows 7. I reduced the Windows partition and installed Ubuntu 12.04 from my USB pen drive. Then I tried to boot again from the USB pen drive, but it doesn't work anymore. I have tried to boot other laptops from my USB pen drive and it works perfectly, but on my laptop, no :(. I have set the USB drive as the first boot device in the BIOS. I have tried each USB port of my laptop, but it doesn't work. It only worked the first time, when I installed Ubuntu, and now it doesn't work anymore. It's very strange: I can see the content of the USB pen drive on my laptop and on other laptops without problems. Why doesn't it work anymore?

    Read the article

  • Google indexed site's address by accident. What do I do now?

    - by AndrejaKo
    I was making a site for a friend of mine and he wanted to be able to see my progress as I worked on the site, so I decided to put the site on a server on my computer and enable access by a domain name registered to me. It turns out that I forgot to set up a robots.txt file for the site and somehow Google indexed the site. My question is: What do I do now? As I understand it, Google doesn't like duplicate content and my friend could have problems when I upload the new site to his server. Right now his current site, which only has a work in progress page, is first on Google when searching for relevant keywords and I really really don't want to damage that. Is there anything else I need to be concerned about?

    Read the article

  • Keeping files private on the internet (.htaccess password or software/php/wordpress password)

    - by jiewmeng
    I was asked a while ago to set up a server such that only authenticated users can access files. It was like a test server for clients to view WIP sites. More recently, I want to do something similar for some of my files. Though they are not very confidential, I wish to be the only one viewing them. I thought of doing the same: create a robots.txt (User-agent: * Disallow: /) and set up some password protection. .htpasswd seems like a very ugly way to do it - it will prompt me even when I log into FTP. I wonder if a software method, like password-protected posts in WordPress, would do the trick of locking out the public and hiding content from search engines? Or would some self-made PHP script do the trick?

    Read the article

  • Avoiding lag when rendering Texture2D for first time

    - by Emir Lima
    I have found a similar question here, but it is about playing sounds. I am using 2048 x 2048 textures for sprite sheets, and every time I call spriteBatch.Draw using a sheet for the first time in the game's execution, it causes considerable lag. The lag doesn't appear the next times. Has someone faced this problem before? What can I do to overcome it? Update: I inserted code at the end of the content load routine that draws EVERY Texture2D that is loaded into the ContentManager before moving on to the game screen. This works well. No lag occurs when different textures are rendered over time, EXCEPT if IsFullScreen is changed. Apparently, changing this property makes the textures loaded onto the GPU disappear. Is that correct?

    Read the article

  • SSMS Tools Pack 1.9.4 is out! Now with SQL Server 2011 (Denali) CTP1 support.

    - by Mladen Prajdic
    To end the year on a good note, this release adds support for SQL Server 2011 (Denali) CTP1 and fixes a few bugs. Because of the new SSMS shell in SQL 2011 CTP1, the SSMS Tools Pack 1.9.4 doesn't have the regions and debug sections functionality for now. The fixed bugs are: a bug that prevented creating insert statements for a database; a bug that didn't script commas as decimal points correctly for non-US settings; a bug with searching through grid results; a threading bug that sometimes happened when saving Window Content History; a bug with Window Connection Coloring throwing an error on startup if a server color was undefined; and a bug with changing shortcuts in SSMS for various features. You can download the new version 1.9.4 here. Enjoy it!

    Read the article

  • Windows Server 2008 R2 and Windows 7 SP1 released to manufacturing

    - by Ryan Roussel
    SP1 went RTM today, which means it will be widely available soon. According to the Microsoft server division blog, the service pack will be available to Microsoft's TechNet and MSDN subscribers, as well as to Microsoft volume licensing customers, on Feb. 16, 2011. It will be generally available via the Microsoft Download Center and Windows Update on Feb. 22. You can see the blog and news here: http://blogs.technet.com/b/windowsserver/archive/2011/02/08/windows-server-2008-r2-and-windows-7-sp1-releases-to-manufacturing-today.aspx New features in SP1 include Dynamic Memory for Hyper-V VMs and RemoteFX, which enables rich desktop content for Virtual Desktop Infrastructure.

    Read the article
