Search Results

Search found 24641 results on 986 pages for 'content provider'.

Page 2/986 | < Previous Page | 1 2 3 4 5 6 7 8 9 10 11 12  | Next Page >

  • What was missing from the Content Strategy Forum?

    - by Roger Hart
    In April, Paris hosted the first ever Content Strategy Forum. The event's website proudly proclaims: 170 attendees, 18 nationalities, 17 speakers, 1 volcano... Content Strategy Forum 2010 rocked the world! The volcano was in Iceland, and the closest we came to rocking the world was a cursory mention in the Huffington Post, but I'll grant the event was awesome. One thing missing from that list, however, is "94 companies" (Plus a couple of universities and freelancers, and what have you). A glance through the attendees directory reveals a fairly wide organisational turnout - 24 students from two Parisian universities, countless design and marketing agencies, a series of tech firms, small and large. Two delegates from IBM, two from ARM, an appearance from RIM, Skype, and Facebook; twelve from the various bits of eBay. Oh, and, err, nobody from Google, Microsoft, Yahoo, Amazon, Play, Twitter, LinkedIn, Craigslist, the BBC, no banks I noticed, and I didn't spot a newspaper. You get the idea. Facebook notwithstanding, you have to scroll through a few pages to Alexa rankings to find company names from the attendee list. I find this interesting, and I'm not wholly sure what to make of it. Of the large, web-centric, content-rich organizations conspicuously absent, at least one of two things is true: They didn't know about the event They didn't care about the event Maybe these guys all have content strategy completely sorted, and it's an utterly naturalised part of their business process. Maybe nobody at say, Apple or Play.com ever publishes a single piece of content that isn't neatly tailored to their (clearly defined, of course) user and business goals. Wouldn't that be lovely? The thing is, in that rosy and beatific world, there's still a case for those folks to join the community. There are bound to be other perspectives, and things to learn. You see, the other thing achingly conspicuous by its absence was case studies. In her keynote address, Kristina Halvorson made the point that what content strategy really needs is some big, loud success stories. A point I'd firmly second as a content strategist working within an organisation. Sarah Cancilla's presentation on content strategy at Facebook included some very neat, specific examples, and was richer for it. It didn't hurt that the example was Facebook - you're getting impressively big numbers off base. What about the other big boys? Is there anybody out there with a perspective? Do we all just look very silly to you, fretting away over text and images and users and purposes? Is content validation and maintenance so accustomed a part of your business that calling attention to it is like sniffing the air and saying "Hmm, a lot of nitrogen about today."? And if it is, do you have any wisdom to share?

    Read the article

  • What is 'lack of original content'?

    - by JVerstry
    It is written everywhere that lack of original content has a negative impact on ranking. But what is lack of original content? (I am not talking about duplicate content.) I guess if you copy another site's content, this makes sense. But assuming one develops one's own functionality, and similar functionality is already available on other sites, is this considered lack of original content? Can Google decide not to index such pages (i.e., not give them a chance at all)? Are there other definitions of 'lack of original content'?

    Read the article

  • Is content slowing down your business?

    - by Lance Shaw
    We are living in a digital world, yet paper is everywhere and expensive, right? We all agree that content is an important part of our organization and contributes to its decision making. However, many of us see dealing with it as a challenge, and the growth of content is impacting our ability to scale and respond quickly to our customers. Business has always been content intensive. For JD Edwards customers, this is an important consideration.  After all, the processes being run in JD Edwards are usually critical to the success of your business, and if they are not running as smoothly as they should because of manual process steps involving paper or searching for content, you should look into improving them.  To that end, we hope you will join this webinar and learn how Oracle and KPIT | SYSTIME have partnered to help a JD Edwards customer content-enable its enterprise with Oracle WebCenter Content and Oracle WebCenter Imaging 11g and integrate them back with JD Edwards to significantly improve processing speed and operational costs.

    Read the article

  • DotNetOpenAuth OpenID Provider "Sequence contains more than one element"

    - by Matthew Johnson
    Hello, all, I'm having trouble implementing my OpenID provider with DNOA 3.4.3. Everything was going absolutely peachy until I needed AX support as well. I set AXFetchAsSregTransform in the web config, as recommended by Andrew at http://groups.google.com/group/dotnetopenid/browse_thread/thread/5629a24c0a7e8d99. Doing this caused me to get the exception "Sequence Contains More Than One Element" on my decide.aspx page, however, and I haven't been able to get past it. The following line is throwing the exception: Edit: Strangely enough, this is not the line throwing the error anymore. The SendResponse() is now triggering the exception ClaimsRequest requestedFields = ProviderEndpoint.PendingRequest.GetExtension(); ProviderEndpoint.SendResponse() Any thoughts on why this may be? Any help would be greatly appreciated! The logs leading up to the error are as follows: 2010-04-28 12:38:20,247 (GMT-7) [5] INFO DotNetOpenAuth.Messaging.Channel - Scanning incoming request for messages: https://myprovider/provider.ashx?openid.ns=http%3A%2F%2Fspecs.openid.net%2Fauth%2F2.0&openid.claimed_id=http%3A%2F%2Fspecs.openid.net%2Fauth%2F2.0%2Fidentifier_select&openid.identity=http%3A%2F%2Fspecs.openid.net%2Fauth%2F2.0%2Fidentifier_select&openid.mode=checkid_setup&openid.ns.ext1=http%3A%2F%2Fopenid.net%2Fsrv%2Fax%2F1.0&openid.ext1.mode=fetch_request&openid.ext1.type.email=http%3A%2F%2Faxschema.org%2Fcontact%2Femail&openid.ext1.type.fullname=http%3A%2F%2Faxschema.org%2FnamePerson&openid.ext1.type.language=http%3A%2F%2Faxschema.org%2Fpref%2Flanguage&openid.ext1.required=email&openid.return_to=http%3A%2F%2Fmyrelyingparty%2Flogin.jsp%3Foidreturn%3D%252Fhome&openid.assoc_handle=%7B634080802953194640%7D%7BHxjFNw==%7D%7B20%7D&openid.realm=http%3A%2F%2Fmyrelyingparty 2010-04-28 12:38:20,285 (GMT-7) [5] INFO DotNetOpenAuth.Messaging.Channel - Processing incoming CheckIdRequest (2.0) message: openid.claimed_id: http://specs.openid.net/auth/2.0/identifier_select openid.identity: http://specs.openid.net/auth/2.0/identifier_select openid.assoc_handle: {634080802953194640}{HxjFNw==}{20} openid.return_to: http://myrelyingparty/login.jsp?oidreturn=%2Fhome openid.realm: http://myrelyingparty/ openid.mode: checkid_setup openid.ns: http://specs.openid.net/auth/2.0 openid.ns.ext1: http://openid.net/srv/ax/1.0 openid.ext1.mode: fetch_request openid.ext1.type.email: http://axschema.org/contact/email openid.ext1.type.fullname: http://axschema.org/namePerson openid.ext1.type.language: http://axschema.org/pref/language openid.ext1.required: email 2010-04-28 12:38:22,773 (GMT-7) [14] INFO DotNetOpenAuth.Messaging.Channel - Scanning incoming request for messages: https://myprovider/login.aspx?ReturnUrl=%2fdecide.aspx 2010-04-28 12:38:36,167 (GMT-7) [5] INFO DotNetOpenAuth.Messaging.Channel - Scanning incoming request for messages: https://myprovider/login.aspx?ReturnUrl=%2fdecide.aspx 2010-04-28 12:38:38,147 (GMT-7) [14] ERROR DotNetOpenAuth.Messaging - Protocol error: An HTTP request to the realm URL (http://myrelyingparty/) resulted in a redirect, which is not allowed during relying party discovery. 
at DotNetOpenAuth.Messaging.ErrorUtilities.VerifyProtocol(Boolean condition, String message, Object[] args) at DotNetOpenAuth.OpenId.Realm.Discover(IDirectWebRequestHandler requestHandler, Boolean allowRedirects) at DotNetOpenAuth.OpenId.Realm.DiscoverReturnToEndpoints(IDirectWebRequestHandler requestHandler, Boolean allowRedirects) at DotNetOpenAuth.OpenId.Provider.HostProcessedRequest.IsReturnUrlDiscoverableCore(OpenIdProvider provider) at DotNetOpenAuth.OpenId.Provider.HostProcessedRequest.IsReturnUrlDiscoverable(OpenIdProvider provider) at OpenIdProviderWebForms.decide.Page_Load(Object src, EventArgs e) at System.Web.Util.CalliHelper.EventArgFunctionCaller(IntPtr fp, Object o, Object t, EventArgs e) at System.Web.Util.CalliEventHandlerDelegateProxy.Callback(Object sender, EventArgs e) at System.Web.UI.Control.OnLoad(EventArgs e) at System.Web.UI.Control.LoadRecursive() at System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) at System.Web.UI.Page.ProcessRequest(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) at System.Web.UI.Page.ProcessRequest() at System.Web.UI.Page.ProcessRequest(HttpContext context) at ASP.decide_aspx.ProcessRequest(HttpContext context) at System.Web.HttpApplication.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute() at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously) at System.Web.HttpApplication.PipelineStepManager.ResumeSteps(Exception error) at System.Web.HttpApplication.BeginProcessRequestNotification(HttpContext context, AsyncCallback cb) at System.Web.HttpRuntime.ProcessRequestNotificationPrivate(IIS7WorkerRequest wr, HttpContext context) at System.Web.Hosting.PipelineRuntime.ProcessRequestNotificationHelper(IntPtr managedHttpContext, IntPtr nativeRequestContext, IntPtr moduleData, Int32 flags) at System.Web.Hosting.PipelineRuntime.ProcessRequestNotification(IntPtr managedHttpContext, IntPtr nativeRequestContext, IntPtr moduleData, Int32 flags) at System.Web.Hosting.PipelineRuntime.ProcessRequestNotificationHelper(IntPtr managedHttpContext, IntPtr nativeRequestContext, IntPtr moduleData, Int32 flags) at System.Web.Hosting.PipelineRuntime.ProcessRequestNotification(IntPtr managedHttpContext, IntPtr nativeRequestContext, IntPtr moduleData, Int32 flags) 2010-04-28 12:38:38,149 (GMT-7) [14] INFO DotNetOpenAuth.Yadis - Relying party discovery at URL http://myrelyingparty/ failed. DotNetOpenAuth.Messaging.ProtocolException: An HTTP request to the realm URL (http://myrelyingparty/) resulted in a redirect, which is not allowed during relying party discovery. 
at DotNetOpenAuth.Messaging.ErrorUtilities.VerifyProtocol(Boolean condition, String message, Object[] args) in c:\TeamCity\buildAgent\work\bf9e2ca68b75a334\src\DotNetOpenAuth\Messaging\ErrorUtilities.cs:line 235 at DotNetOpenAuth.OpenId.Realm.Discover(IDirectWebRequestHandler requestHandler, Boolean allowRedirects) in c:\TeamCity\buildAgent\work\bf9e2ca68b75a334\src\DotNetOpenAuth\OpenId\Realm.cs:line 446 at DotNetOpenAuth.OpenId.Realm.DiscoverReturnToEndpoints(IDirectWebRequestHandler requestHandler, Boolean allowRedirects) in c:\TeamCity\buildAgent\work\bf9e2ca68b75a334\src\DotNetOpenAuth\OpenId\Realm.cs:line 424 at DotNetOpenAuth.OpenId.Provider.HostProcessedRequest.IsReturnUrlDiscoverableCore(OpenIdProvider provider) in c:\TeamCity\buildAgent\work\bf9e2ca68b75a334\src\DotNetOpenAuth\OpenId\Provider\HostProcessedRequest.cs:line 142 2010-04-28 12:38:42,076 (GMT-7) [8] ERROR OpenIdProviderWebForms.Global - An unhandled exception was raised. Details follow: System.Web.HttpUnhandledException: Exception of type 'System.Web.HttpUnhandledException' was thrown. --- System.InvalidOperationException: Sequence contains more than one element at System.Linq.Enumerable.SingleOrDefault[TSource](IEnumerable`1 source) at DotNetOpenAuth.OpenId.Provider.Request.GetExtension[T]() in c:\TeamCity\buildAgent\work\bf9e2ca68b75a334\src\DotNetOpenAuth\OpenId\Provider\Request.cs:line 176 at DotNetOpenAuth.OpenId.Extensions.ExtensionsInteropHelper.ConvertSregToMatchRequest(IHostProcessedRequest request) in c:\TeamCity\buildAgent\work\bf9e2ca68b75a334\src\DotNetOpenAuth\OpenId\Extensions\ExtensionsInteropHelper.cs:line 180 at DotNetOpenAuth.OpenId.Behaviors.AXFetchAsSregTransform.DotNetOpenAuth.OpenId.Provider.IProviderBehavior.OnOutgoingResponse(IAuthenticationRequest request) in c:\TeamCity\buildAgent\work\bf9e2ca68b75a334\src\DotNetOpenAuth\OpenId\Behaviors\AXFetchAsSregTransform.cs:line 139 at DotNetOpenAuth.OpenId.Provider.OpenIdProvider.ApplyBehaviorsToResponse(IRequest request) in c:\TeamCity\buildAgent\work\bf9e2ca68b75a334\src\DotNetOpenAuth\OpenId\Provider\OpenIdProvider.cs:line 482 at DotNetOpenAuth.OpenId.Provider.OpenIdProvider.SendResponse(IRequest request) in c:\TeamCity\buildAgent\work\bf9e2ca68b75a334\src\DotNetOpenAuth\OpenId\Provider\OpenIdProvider.cs:line 325 at OpenIdProviderWebForms.decide.Yes_Click(Object sender, EventArgs e) in C:\Projects\OpenIdProviderWebForms\decide.aspx.cs:line 130 at System.Web.UI.WebControls.Button.OnClick(EventArgs e) at System.Web.UI.WebControls.Button.RaisePostBackEvent(String eventArgument) at System.Web.UI.Page.RaisePostBackEvent(IPostBackEventHandler sourceControl, String eventArgument) at System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) --- End of inner exception stack trace --- at System.Web.UI.Page.HandleError(Exception e) at System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) at System.Web.UI.Page.ProcessRequest(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) at System.Web.UI.Page.ProcessRequest() at System.Web.UI.Page.ProcessRequest(HttpContext context) at ASP.decide_aspx.ProcessRequest(HttpContext context) in c:\Windows\Microsoft.NET\Framework64\v2.0.50727\Temporary ASP.NET Files\root\7f580b93\b3e4d917\App_Web_tulh9ymv.1.cs:line 0 at System.Web.HttpApplication.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute() at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, 
Boolean& completedSynchronously)
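
    A minimal sketch of one possible workaround (my assumption, not the poster's code): answer the AX fetch request explicitly on the decide page instead of relying on the AXFetchAsSregTransform behavior, so the Sreg/AX interop code never has to merge two extension sets. The handler name and attribute values below are illustrative.

        using DotNetOpenAuth.OpenId.Provider;
        using DotNetOpenAuth.OpenId.Extensions.AttributeExchange;

        protected void Yes_Click(object sender, EventArgs e)
        {
            // Pending request placed there by provider.ashx earlier in the flow.
            IAuthenticationRequest pendingRequest = ProviderEndpoint.PendingAuthenticationRequest;

            // Read the AX fetch request directly instead of the Sreg ClaimsRequest.
            var fetchRequest = pendingRequest.GetExtension<FetchRequest>();
            if (fetchRequest != null)
            {
                var fetchResponse = new FetchResponse();
                if (fetchRequest.Attributes.Contains(WellKnownAttributes.Contact.Email))
                {
                    // Illustrative value; a real provider would use the logged-in user's email.
                    fetchResponse.Attributes.Add(new AttributeValues(
                        WellKnownAttributes.Contact.Email, "user@example.com"));
                }
                pendingRequest.AddResponseExtension(fetchResponse);
            }

            pendingRequest.IsAuthenticated = true;
            ProviderEndpoint.SendResponse();
        }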

    Read the article

  • How can I inform search engines that the usefulness of some content on my site has a limited shelf life?

    - by Tim Post
    Let's say that I run a forum dedicated to computer hardware. Naturally, people are going to ask questions like: What is the best laptop for running [os] Or What is the best video card for under [amount] These may be perfectly fine discussions, but the content loses usefulness over time. An answer to either question asked in 2007 might still be relevant in 2008, but definitely not in 2012. Is there a way that I can tell search engines that certain pages might not give visitors what they're looking for after a certain date, and perhaps hint to a page on my site that would provide good information? Perhaps something I could set in HTTP response headers, meta tags or even a site map?
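
    One concrete mechanism worth mentioning here (my addition, not something from the question itself) is the unavailable_after directive that Google supports in the robots meta tag and in the X-Robots-Tag response header. A minimal ASP.NET sketch, with the cutoff date purely illustrative:

        // Hedged sketch: signal that this page should drop out of the index after a date.
        // The same directive can be emitted in markup as
        // <meta name="robots" content="unavailable_after: 1 Jan 2013 00:00:00 GMT">.
        protected void Page_Load(object sender, EventArgs e)
        {
            Response.AppendHeader("X-Robots-Tag",
                "unavailable_after: 1 Jan 2013 00:00:00 GMT");
        }

    This only covers de-indexing; pointing visitors toward a fresher page would still be handled in the page itself, for example with a prominent link.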

    Read the article

  • XNA extending an existing Content type

    - by Maarten
    We are doing a game in XNA that reacts to music. We need to do some offline processing of the music data and therefore we need a custom type containing the Song and some additional data: // Project AudioGameLibrary namespace AudioGameLibrary { public class GameTrack { public Song Song; public string Extra; } } We've added a Content Pipeline extension: // Project GameTrackProcessor namespace GameTrackProcessor { [ContentSerializerRuntimeType("AudioGameLibrary.GameTrack, AudioGameLibrary")] public class GameTrackContent { public SongContent SongContent; public string Extra; } [ContentProcessor(DisplayName = "GameTrack Processor")] public class GameTrackProcessor : ContentProcessor<AudioContent, GameTrackContent> { public GameTrackProcessor(){} public override GameTrackContent Process(AudioContent input, ContentProcessorContext context) { return new GameTrackContent() { SongContent = new SongProcessor().Process(input, context), Extra = "Some extra data" // Here we can do our processing on 'input' }; } } } Both the Library and the Pipeline extension are added to the Game Solution and references are also added. When trying to use this extension to load "gametrack.mp3" we run into problems however: // Project AudioGame protected override void LoadContent() { AudioGameLibrary.GameTrack gameTrack = Content.Load<AudioGameLibrary.GameTrack>("gametrack"); MediaPlayer.Play(gameTrack.Song); } The error message: Error loading "gametrack". File contains Microsoft.Xna.Framework.Media.Song but trying to load as AudioGameLibrary.GameTrack. AudioGame contains references to both AudioGameLibrary and GameTrackProcessor. Are we maybe missing other references? EDIT Selecting the correct content processor helped, it loads the audio file correctly. However, when I try to process some data, e.g: public override GameTrackContent Process(AudioContent input, ContentProcessorContext context) { int count = input.Data.Count; // With this commented out it works fine return new GameTrackContent() { SongContent = new SongProcessor().Process(input, context) }; } It crashes with the following error: Managed Debugging Assistant 'PInvokeStackImbalance' has detected a problem in 'C:\Users\Maarten\Documents\Visual Studio 2010\Projects\AudioGame\DebugPipeline\bin\Debug\DebugPipeline.exe'. Additional Information: A call to PInvoke function 'Microsoft.Xna.Framework.Content.Pipeline!Microsoft.Xna.Framework.Content.Pipeline.UnsafeNativeMethods+AudioHelper::OpenAudioFile' has unbalanced the stack. This is likely because the managed PInvoke signature does not match the unmanaged target signature. Check that the calling convention and parameters of the PInvoke signature match the target unmanaged signature. Information from logger right before crash: Using "BuildContent" task from assembly "Microsoft.Xna.Framework.Content.Pipel ine, Version=4.0.0.0, Culture=neutral, PublicKeyToken=842cf8be1de50553". Task "BuildContent" Building gametrack.mp3 -> bin\x86\Debug\Content\gametrack.xnb Rebuilding because asset is new Importing gametrack.mp3 with Microsoft.Xna.Framework.Content.Pipeline.Mp3Imp orter Im experiencing exactly this: http://forums.create.msdn.com/forums/t/75996.aspx
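
    For comparison, a hypothetical alternative to the [ContentSerializerRuntimeType] attribute (my sketch, not something from the post) is an explicit writer/reader pair, which also makes it obvious that the .mp3 asset must be assigned to the custom processor. All names below are illustrative:

        // In the pipeline extension project (GameTrackProcessor):
        using Microsoft.Xna.Framework.Content.Pipeline;
        using Microsoft.Xna.Framework.Content.Pipeline.Processors;
        using Microsoft.Xna.Framework.Content.Pipeline.Serialization.Compiler;

        [ContentTypeWriter]
        public class GameTrackWriter : ContentTypeWriter<GameTrackContent>
        {
            protected override void Write(ContentWriter output, GameTrackContent value)
            {
                output.WriteObject(value.SongContent);      // built-in writer handles the song data
                output.Write(value.Extra ?? string.Empty);  // our extra per-track data
            }

            public override string GetRuntimeReader(TargetPlatform targetPlatform)
            {
                return "AudioGameLibrary.GameTrackReader, AudioGameLibrary";
            }

            public override string GetRuntimeType(TargetPlatform targetPlatform)
            {
                return "AudioGameLibrary.GameTrack, AudioGameLibrary";
            }
        }

        // In the shared library (AudioGameLibrary):
        using Microsoft.Xna.Framework.Content;
        using Microsoft.Xna.Framework.Media;

        public class GameTrackReader : ContentTypeReader<GameTrack>
        {
            protected override GameTrack Read(ContentReader input, GameTrack existingInstance)
            {
                return new GameTrack
                {
                    Song = input.ReadObject<Song>(),
                    Extra = input.ReadString()
                };
            }
        }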

    Read the article

  • Problem with user generated content

    - by grasshopper
    In general, which do you think is better with regard to adding content to a site: allowing users to add content and providing a flag button to report anything that doesn't fit with the site, or should only I add the content and remove that option? It will be a small site, but I don't know if I'll manage to scan the site constantly or deal with the flags; on the other hand, I'm worried that the site won't move forward because there will be a lot less content. Thoughts?

    Read the article

  • Apache Content negotiation returns wrong Content-Type with Multiviews :(

    - by edbras
    I am using Apache webserver 2.2 with content negotiation through Multiviews. I have: <Directory /home/develop/web/prodBuild> Options +MultiViews AddEncoding x-gzip .gz </Directory> This directory contains the gzip files. So if a browser requests the file bla.js, the server will return the file bla.js.gz because the file bla.js doesn't exist. However, the Content-Type on my Ubuntu server is set to "application/x-gzip", which is wrong since it's a JavaScript file, so it has to be "application/javascript". I have this working on my local Windows Apache server, but I can't seem to get it working on the remote Ubuntu server, and I have no idea why. What is setting this wrong Content-Type? BTW: I don't have a line like "AddType application/x-gzip .gz .tgz" anywhere in my config, as then I would understand why it goes wrong. My workaround: add the following line under the above "AddEncoding": AddType "" .gz It will then set an empty string as the Content-Type, and then it works. I think the browser then tries to discover the content type itself. I hope somebody can explain why this is not working and why this content type is set.
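
    For what it's worth, the pattern usually suggested for pre-compressed files under MultiViews (a sketch of the standard approach, assuming the .gz type mapping comes from the default mime.types) is to remove the type association for .gz entirely, so the .js extension supplies the Content-Type and .gz only supplies the encoding:

        <Directory /home/develop/web/prodBuild>
            Options +MultiViews
            # Drop the application/x-gzip media type attached to .gz, leaving only
            # the encoding; bla.js.gz is then served as
            # Content-Type: application/javascript with Content-Encoding: gzip.
            RemoveType .gz
            AddEncoding x-gzip .gz
        </Directory>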

    Read the article

  • Seriously, It’s Time to Get Your Content Act Together

    - by Mike Stiles
    Branded content, content marketing, social content, brand journalism, we’re seeing those terms more and more. Why? The technology tools are coming together. We should know. We can gather big data, crunch it, listen to the public, moderate, respond, get to know the customer intimately, know what they like, know what they want, we can target, distribute, amplify, measure engagement and reaction, modify strategy and even automate a great deal of all that. An amazing machine, a sleek, smooth-running engine has been built such that all the parts can interact and work together to deliver peak performance and maximum output. But that engine isn’t going anywhere without any gas. Content is the gas. Yes, we curate other people’s content. We can siphon their gas. There’s tech to help with that too. But as for the creation of original, worthwhile content made for a specific audience, our audience, machines can’t do that…at least not yet. Curated content is great. But somebody has to originate the content for it to be curated and shared. And since the need for good, curated content is obviously large and the desire to share is there, it’s a winning proposition for a brand to be a consistent producer of original content. And yet, it feels like content is an issue we’re avoiding. There’s a reluctance to build a massive pipeline if you have no idea what you’re going to run through it. The C-suite often doesn’t know what content is, that it’s different from ads, where to get it, who makes it, how long it should be, what the point of it is if there’s no hard sell of the product, what it costs, how to use it, how to measure it, how to make sure it’s good, or how to make sure it will keep flowing. It could be the reason many brands aren’t pulling the trigger on socially enabling the enterprise. And that’s a shame, because there are a lot of creative, daring, experimental, uniquely talented entertainers and journalists chomping at the bit to execute content for brands. But for many corporate executives, content is “weird,” and the people who make it are even weirder. The content side of the equation is human. It’s art, but art that can be informed by data. The natural inclination is for brands to turn to their agencies for such creative endeavors. But agencies are falling into one of two categories. They’re failing to transition from ads to content. In “Content Era, What’s the Role of Agencies?” Alexander Jutkowitz says agencies were made for one-hit campaigns, not ongoing content. Or, they’re ready and capable but can’t get clients to do the right things. Agencies have to make money, even if it means continuing to do the wrong things because that’s all the client will agree to. So what we wind up with in the pipeline is advertising, marketing-heavy content, content that was obviously created or spearheaded by non-creative executives, random & inconsistent content, copy written for SEO bots, and other completely uninteresting nightmares. Frank Rose, author of “The Art of Immersion,” writes, “Content without story and excitement is noise pollution.” In the old days, you made an ad and inserted it into shows made by people who knew what they were doing. You could bask in that show’s success and leverage their audience. Now, you are tasked with attracting, amassing and holding your own audience. You may just want to make, advertise and sell your widgets. But now there’s a war on for a precious commodity, attention. People are busy. They have filters to keep uninteresting and irrelevant things out. 
They value their time and expect value back when they give it up. Joe Pulizzi, founder of the Content Marketing Institute, says, "Your customers don't care about you, your products, your services…they care about themselves, their wants and their needs." Is it worth getting serious about content and doing it right? 61% of consumers feel better about a company that delivers custom content (Custom Content Council). Interesting content is one of the top 3 reasons people follow brands on social (Content+). 78% of consumers think organizations that provide custom content want to build good relationships with them (TMG Custom Media). On the B2B side, 80% of business decision makers prefer to get company info in a series of articles vs. an ad. So what’s the hang-up? Cited barriers to content marketing are lack of human resources (42%) and lack of budget (35%). 54% of brands don’t have a single on-site, dedicated content creator. And only 38% of brands have a content marketing strategy. Tech has built the biggest, most incredible stage for brands that’s ever been built. Putting something on that stage is your responsibility. Do a bad show, or no show at all, and you’ll be the beautiful, talented actress that never got discovered. @mikestilesPhoto: Gabriella Fabbri, stock.xchng

    Read the article

  • Country specific content vs global content

    - by Ando
    I have a global product presentation website myproduct.com For certain countries I also own the country domain: myproduct.co.uk, myproduct.com.au, myproduct.es, myproduct.de, etc. The presentation website is translated in multiple languages and I set up redirects: myproduct.es will redirect to myproduct.com/es/, myproduct.de will redirect to myproduct.com/de/, etc. . The content so far is the same, just translated in different languages. The advantages are that it's easy to keep the content aligned - everything is managed from one centralized dashboard (I'm using Wordpress with qtranslate). Now I'm running into trouble as for different countries I want localized content - for UK I want to run different promotions and use a different reseller than for .com.au so I would like that users coming from myproduct.co.uk see something different than those coming from myproduct.com.au (and not be redirected to myproduct.com as they are right now). How can I achieve this? I could duplicate the whole main website and modify only certain parts but then I would have a lot of duplicate content (e.g. info about how the product works) and I would have pages that are likely to change (FAQ page) that I would have to keep updated over all websites. I can duplicate only partially the main website: on the localized website I would have only the pages that are different and then all other links would point to the .com site. This would solve the duplication problem but would cause confusion for the user as you would navigate from .co.uk to .com without noticing and then wonder how to get back. Other, better option?
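
    Whichever structure is chosen, one piece of plumbing that normally accompanies it (my addition, not part of the original question) is marking the per-country variants as alternates of each other with hreflang annotations, so search engines can serve the right version for each locale. Illustrative markup using the domains from the question:

        <link rel="alternate" hreflang="en-GB" href="http://myproduct.co.uk/" />
        <link rel="alternate" hreflang="en-AU" href="http://myproduct.com.au/" />
        <link rel="alternate" hreflang="es" href="http://myproduct.com/es/" />
        <link rel="alternate" hreflang="x-default" href="http://myproduct.com/" />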

    Read the article

  • How do I dynamically reload content files?

    - by Kikaimaru
    Is there a relatively simple way to dynamically reload content files, such as effect files? I know I can do the following: Detect change of file Run content pipeline to rebuild that specific file Unload ALL content that was loaded Load all content And use double references to reference content files. The problem is with step 3 (and step 2 isn't that nice either). I need to unload everything because if I have model Hero.x which references Model.fx effect, and I change the Model.fx file, I need to reload the Hero.x file which will then call LoadExternalReference on Model.fx. Has someone managed to make this work without rewriting the whole ContentManager (and every ContentReader) and tracking calls to LoadExternalReference?
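
    A sketch of one common mitigation (an assumption on my part, not something from the question): scope the hot-reloadable assets to their own ContentManager, so step 3 only unloads that subset rather than everything the game has loaded. Field and asset names are illustrative:

        using Microsoft.Xna.Framework.Content;
        using Microsoft.Xna.Framework.Graphics;

        ContentManager reloadableContent;
        Model heroModel;

        protected override void LoadContent()
        {
            // Separate manager for assets that may be rebuilt at runtime.
            reloadableContent = new ContentManager(Services, Content.RootDirectory);
            heroModel = reloadableContent.Load<Model>("Hero");
        }

        void ReloadEffects()
        {
            // Drops Hero.x and its Model.fx dependency together,
            // then re-reads the rebuilt .xnb files.
            reloadableContent.Unload();
            heroModel = reloadableContent.Load<Model>("Hero");
        }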

    Read the article

  • UPK Pre-Built Content Update

    - by Karen Rihs
    UPK pre-built content development efforts are always underway and growing. Over the last few months, the following new and upgraded modules became available:
    NEW CONTENT RELEASES
    - E-Business Suite 12.1: Field Service; Manufacturing Operations Center; Process Manufacturing: System Administration; Strategic Network Optimization; U.S. Federal Financials
    - Oracle Communications 11.1: Oracle Communications UPK for Pricing Design Center, Voice and Data Offerings
    - Oracle Mobile Workforce 2.1.0: Administrative Setup; User Tasks
    - Primavera: Primavera Portfolio Management 9.0
    UPK CONTENT UPGRADES
    - JDE E1 9.1: HCM Fundamentals for EnterpriseOne; Manufacturing - Product Data Management; Manufacturing Management Discrete Shop Floor Management; Procurement and Subcontract Management
    - JDE World A9.3: Accounts Payable; Address Book; Common Foundation; General Ledger
    For a list of modules currently available for each product line, visit the UPK Resource Library on Oracle.com. For more information on how your organization can take advantage of UPK pre-built content, see our previous blog, The Value of UPK Pre-Built Content. - Karen Rihs, UPK Outbound Product Management

    Read the article

  • Dynamic content reloading

    - by Kikaimaru
    Is there a relatively simple way to dynamically reload content files (i.e. effect files)? I know I can do the following: detect a change of file, run the content pipeline to rebuild that specific file, unload ALL content that was loaded, load all content, and use double references to reference content files. The problem is with step 3 (and step 2 isn't that nice either). But I need to unload everything because if I have model Hero.x which references the Model.fx effect, and I change the Model.fx file, I need to reload the Hero.x file, which will then call LoadExternalReference on Model.fx. So I guess the question is: did someone manage to make this work without rewriting the whole ContentManager (and every ContentReader) and tracking calls to LoadExternalReference?

    Read the article

  • Collaborate 2010 Recap: A lot of Excitement for Oracle Content Management 11g

    - by [email protected]
    Collaborate brought me to Las Vegas last week, and what a week it was. Each day was jam packed with Oracle Content Management sessions, and almost every session I attended was full. Across the 35+ sessions that were given by my Oracle peers, Oracle partners, and Oracle customers, the majority of the discussion and questions that were asked had to do with the release of Oracle Content Management 11g. Just to bring everyone up-to-speed, the first wave of Oracle Content Management 11g releases happened this past January as Oracle Imaging & Process Management and Oracle Information Rights Management went GA. The next wave, which should be released soon, includes Oracle Universal Content Management and Oracle Universal Records Management. Andy MacMillan and Roel Stalman kicked off these discussions last Monday, as they presented Oracle Content Management's product strategy and roadmap. It seemed that the attendees liked what they heard regarding the strategy and future direction, but the question that seems to always come up after roadmap presentations is "when will the product be released"?
    This is a question that none of us have the power to answer, but soon customers will be able to enjoy these new product capabilities: a unified content repository across ECM; centralized installation, access, administration & monitoring; certified application integrations with solution templates; and open Web Content Management. Stay tuned for more news about the release of Oracle Universal Content Management and Oracle Records Management. There are a lot of new assets currently being built that will help get everyone up-to-speed quickly. Outside of the sessions that were presented, there were a lot of other activities that took place at Collaborate. The Enterprise 2.0 solutions demo pod was busy, and attendees were anxious to see demonstrations of Oracle's end-to-end document imaging solution, WebCenter Spaces, and web site creation using Oracle Universal Content Management. I also want to thank our partners (Fishbowl Solutions, Redstone Content Solutions, Bezzotech, Team Informatics, and DTI) for their efforts in creating detailed, insightful presentations. Also, special thanks are in order to Thomas Feldmeier and Markus Neubauer of Silbury IT-Beratung GmbH for their participation.
    It seems that Thomas and Markus were doomed to be stranded in Frankfurt after the Icelandic ash storm. They couldn't get a flight out of their native Germany, and with fear that they would miss Collaborate, they rented a car and drove to Rome - some 800 miles (1,200 kilometers). Anyway, they made it safe and sound to Las Vegas, and although probably a bit tired, they gave two Oracle Content Management presentations. Talk about commitment. Finally, a very special thanks to Al Hoof and Dave Chaffee of the Oracle Content Management Special Interest Group (SIG). Al and Dave did most of the heavy lifting for Collaborate, including the coordination of all the sessions. The Independent Oracle Users Group presented Al with the Chris Wooldridge award, recognizing him as the volunteer of the year. Here is Al with his award. I hope to see you next year at Collaborate as the show returns to Orlando.

    Read the article

  • Portal Content Personalization

    - by john.brunswick
    To make the most effective use of a portal and content management platform, personalization is a critical component of delivering the most value to end users. Regardless of what type of constituents you may be serving, content relevance is critical to support business goals like self-service, communication within a geographically distributed organization, lead generation and customer loyalty effectively. This especially holds true when serving external parties, as they generally have a lower threshold for digging through your site to locate a particular item of interest and are apt to leave or dial a helpdesk if their efforts cannot locate the relevant information. Optimal delivery of content can be achieved through a variety of methods, but it is generally a blend of security and filtering via meta data that can drive the most return with the least amount of upfront effort and ongoing upkeep. In a portal environment various platform components have their strong suits and by combining the capabilities of enterprise portal and content platforms much of the groundwork for personalization can be achieved in a configuration-based manner. In our discussion we will cover terminology and concepts, example scenarios and technical implementation strategies to help showcase how personalization of content can be achieved within a portal from a technical and strategic standpoint. Read on to better understand the chart below and the components at our disposal to personalize content delivery. Read on... click here to view a full size chart

    Read the article

  • XNA 4.0: Problem with loading XML content files

    - by 200
    I'm new to XNA so hopefully my question isn't too silly. I'm trying to load content data from XML. My XML file is placed in my Content project (the new XNA 4 thing), called "Event1.xml": <?xml version="1.0" encoding="utf-8" ?> <XnaContent> <Asset Type="MyNamespace.Event"> // error on this line <name>1</name> </Asset> </XnaContent> My "Event" class is placed in my main project: using System; using System.Collections.Generic; using System.Linq; using System.Text; using System.Xml.Linq; namespace MyNamespace { public class Event { public string name; } } The XML file is being called by my main game class inside the LoadContent() method: Event temp = Content.Load<Event>("Event1"); And this is the error I'm getting: There was an error while deserializing intermediate XML. Cannot find type "MyNamespace.Event" I think the problem is because at the time the XML is being evaluated, the Event class has not been established because one file is in my main project while the other is in my Content project (being a XNA 4.0 project and such). I have also tried changing the build action of the xml file from compile to content; however the program would then not be able to find the file and would give this other warning: Project item 'Event1.xml' was not built with the XNA Framework Content Pipeline. Set its Build Action property to Compile to build it. Any suggestions are appreciated. Thanks.
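
    The usual arrangement for this situation (a sketch of the common fix, not a confirmed solution for this exact project) is to move the runtime type into a separate class library that both the game project and the content project reference, so the pipeline can resolve MyNamespace.Event while deserializing the XML:

        // MyDataTypes class library, referenced by BOTH the game project and the content project.
        namespace MyNamespace
        {
            public class Event
            {
                public string name;
            }
        }

        // Game project, unchanged:
        // Event temp = Content.Load<Event>("Event1");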

    Read the article

  • Extracting pure content / text from HTML Pages by excluding navigation and chrome content

    - by Ankur Gupta
    Hi, I am crawling news websites and want to extract the news title, news abstract (first paragraph), etc. I plugged into the WebKit parser code to easily navigate a web page as a tree. To eliminate navigation and other non-news content, I take the text version of the article (minus the HTML tags; WebKit provides an API for this). Then I run a diff algorithm comparing various articles' text from the same website, which results in similar text being eliminated. This gives me the content minus the common navigation content, etc. Despite the above approach I am still getting quite a lot of junk in my final text, which results in an incorrect news abstract being extracted. The error rate is 5 in 10 articles, i.e. 50%. Can you suggest an alternative strategy for extraction of pure content? Would learning natural language processing help in extracting the correct abstract from these articles? How would you approach the above problem? Are there any research papers on the same? Regards, Ankur Gupta
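
    As an illustration of one alternative heuristic (my addition; the library and threshold are assumptions), link density can separate navigation from body text without diffing across articles, since boilerplate blocks are mostly anchor text. A naive C# sketch using the HtmlAgilityPack package:

        using System;
        using System.Linq;
        using HtmlAgilityPack;

        static bool LooksLikeBoilerplate(HtmlNode block)
        {
            string text = block.InnerText ?? string.Empty;
            if (text.Trim().Length == 0)
                return true;
            // Blocks whose text is mostly links are likely navigation or chrome.
            int linkTextLength = block.Descendants("a").Sum(a => (a.InnerText ?? string.Empty).Length);
            return (double)linkTextLength / text.Length > 0.5;   // illustrative threshold
        }

        static string ExtractBodyText(string html)
        {
            var doc = new HtmlDocument();
            doc.LoadHtml(html);
            var paragraphs = doc.DocumentNode.Descendants("p")
                                .Where(p => !LooksLikeBoilerplate(p));
            return string.Join(Environment.NewLine, paragraphs.Select(p => p.InnerText.Trim()));
        }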

    Read the article

  • Using rel=next and rel=prev with multiple sets of paginated content on the same page

    - by jakejgordon
    We are running into issues with trying to figure out how to implement rel="next" and rel="prev" -- coupled with rel="canonical" -- with multiple sets of paginated content on the same page, with pages in multiple cultures. In other words, how do we implement these when we have a pager for both Product Reviews and Questions and Answers (aka "Q&A") on the same page, with duplicate content across culture-specific URLs (e.g. /us/en/my-product vs. /ca/en/my-product)? Our current implementation will actually do a full postback when you click Page 2, and will add something to the query string (e.g. website.com/ca/en/my-product?previewpage=2 or website.com/ca/en/my-product?questionpage=2). If we only had one set of paginated content then the implementation would certainly be more straightforward. Adding a second set of paginated content (i.e. Q&A) complicates things. Let's assume that we want the United States English page to be the canonical target (i.e. /us/en/my-product) based on culture. If you go to the /ca/en/my-product page you'll have a rel="canonical" href="/us/en/my-product". So far so good. Let's also assume that we are not implementing a page that lists ALL Product Reviews and Q&A. This would likely solve a number of our problems by using rel="canonical" to this page, but is not an option for reasons that are out of scope for this discussion. Now if you click on page 2 of Product Reviews, it will reload the page with /ca/en/my-product?reviewpage=2 as the URL. Given this scenario, here are my questions: On page 2 of the my-product page on the Canadian site, should there be a rel="canonical" to /us/en/my-product?reviewpage=2 (assuming the content is identical in the United States and Canada)? Should the rel="prev" go to /ca/en/my-product?reviewpage=1 or should it go to /ca/en/my-product ? The query-string version would really only be accessible if using the pager and shows the exact same content as the base page. The following two questions are closely related to this one. Should the /ca/en/my-product?reviewpage=1 have a rel canonical directly to /us/en/my-product (United States page with nothing in query string) since the content is identical)? Given that Q&A content is also paginated, should there be a rel="next" on the base page without query string? In other words, should the /ca/en/my-product page have a rel="next" to /ca/en/my-product?reviewpage=2 AND rel="next" to /ca/en/my-product?questionpage=2 . So far as I can tell it doesn't make sense to have multiple rel="next" implementations on the same page. I suspect that the pages with query string values should have rel="next" and rel="prev" that only point to other pages with query strings and not to the base page. The ?reviewpage=1 and ?questionpage=1 pages would then just have a rel="canonical" to /us/en/my-product . Thoughts? I know this is a tough one -- that's why I brought it to this community. Thanks so much for your help in advance!
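
    Purely to illustrate the mechanics being discussed (not a recommendation on which interpretation is right), under the "canonical points at the equivalent US page, prev/next stay within the query-string series" reading, the head of /ca/en/my-product?reviewpage=2 would carry something like:

        <link rel="canonical" href="http://website.com/us/en/my-product?reviewpage=2" />
        <link rel="prev" href="http://website.com/ca/en/my-product" />
        <link rel="next" href="http://website.com/ca/en/my-product?reviewpage=3" />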

    Read the article

  • User (MS-Office) generated content - how?

    - by Avi
    How can I allow users to share Microsoft Office generated content on an ASP.NET site? As an example of usage, imagine a site similar to Stack Overflow. George, writing a question, uses Word, Excel or OneNote to create content, and then inserts the content into the question area (probably copying it to the clipboard and then using some "paste from Office" widget). Harry, who doesn't have MS Office on his computer, can still see in his browser the content George has generated. If Harry wants to add content, he can use the built-in editor, just like on Stack Overflow, and has to be satisfied with lesser functionality. Sue, who has MS Office installed, can of course see the content in the browser just like Harry. In addition, she can "export" this content and process it in the application George used to generate it. So, how do I do it? Would the Save/Export to HTML feature work? Any tools? Samples? Articles? Office 2007 or later is OK.

    Read the article

  • Content Flood and Frustration? ==> Content Salvation via WebCenter

    - by Michael Snow
    If you are still stuck on Documentum and somehow have missed hearing about our Move-Off Documentum program, here's a refresher on the basics to help save you from your ongoing frustration: check out Capitalizing on Content. How much content have you pushed around over email, through collaboration, and via social channels this week? Have you been engaged with a process that was smooth, problem-free, and left you with a lasting impression of a great user experience? How are you managing your online presence to ensure that your customers, partners, employees and potential future customers are experiencing a consistently engaging web experience? You really can't do this without a solid Web Experience Management platform. Learn more from the CITO Research white paper: Creating a Successful and Meaningful Customer Experience on the Web. If you'd like to catch up on what Oracle WebCenter has been doing lately, check out the latest recording now available on demand: March 2012 Quarterly Customer Update Webcast. It's amazing how much can be done in a day on the internet, and even more impressive to see it presented in an infographic like the one below by mbaonline.com. How much longer can you manage the ever increasing volumes of content without some technology assistance? Created by: MBAOnline.com

    Read the article

  • Internationalize WebCenter Portal - Content Presenter

    - by Stefan Krantz
    Lately we have been involved in engagements where internationalization has been holding the project back from success. In this post we are going to explain how to get Content Presenter and its editorials to comply with the current selected locale for the WebCenter Portal session. As you probably know by now WebCenter Portal leverages the Localization support from Java Server Faces (JSF), in this post we will assume that the localization is controlled and enforced by switching the current browsers locale between English and Spanish. There is two main scenarios in internationalization of a content enabled pages, since Content Presenter offers both presentation of information as well as contribution of information, in this post we will look at how to enable seamless integration of correct localized version of the back end content file and how to enable the editor/author to edit the correct localized version of the file based on the current browser locale. Solution Scenario 1 - Localization aware content presentation Due to the amount of steps required to implement the enclosed solution proposal I have decided to share the solution with you in group components for each facet of the solution. If you want to get more details on each step, you can review the enclosed components. This post will guide you through the steps of enabling each component and what it enables/changes in each section of the system. Enable Content Presenter Customization By leveraging a predictable naming convention of the data files used to hold the content for the Content Presenter instance we can easily develop a component that will dynamically switch the name out before presenting the information. The naming convention we have leverage is the industry best practice by having a shared identifier as prefix (ContentABC) and a language enabled suffix (_EN) (_ES). So the assumption is that each file pair in above example should look like following:- English version - (ContentABC_EN)- Spanish version - (ContentABC_ES) Based on above theory we can now easily regardless of the primary version assigned to the content presenter instance switch the language out by using the localization support from JSF. 
Below java bean (oracle.webcenter.doclib.internal.view.presenter.NLSHelperBean) is enclosed in the customization project available for download at the bottom of the post: 1: public static final String CP_D_DOCNAME_FORMAT = "%s_%s"; 2: public static final int CP_UNIQUE_ID_INDEX = 0; 3: private ContentPresenter presenter = null; 4:   5:   6: public NLSHelperBean() { 7: super(); 8: } 9:   10: /** 11: * This method updates the configuration for the pageFlowScope to have the correct datafile 12: * for the current Locale 13: */ 14: public void initLocaleForDataFile() { 15: String dataFile = null; 16: // Checking that state of presenter is present, also make sure the item is eligible for localization by locating the "_" in the name 17: if(presenter.getConfiguration().getDatasource() != null && 18: presenter.getConfiguration().getDatasource().isNodeDatasource() && 19: presenter.getConfiguration().getDatasource().getNodeIdDatasource() != null && 20: !presenter.getConfiguration().getDatasource().getNodeIdDatasource().equals("") && 21: presenter.getConfiguration().getDatasource().getNodeIdDatasource().indexOf("_") > 0) { 22: dataFile = presenter.getConfiguration().getDatasource().getNodeIdDatasource(); 23: FacesContext fc = FacesContext.getCurrentInstance(); 24: //Leveraging the current faces contenxt to get current localization language 25: String currentLocale = fc.getViewRoot().getLocale().getLanguage().toUpperCase(); 26: String newDataFile = dataFile; 27: String [] uniqueIdArr = dataFile.split("_"); 28: if(uniqueIdArr.length > 0) { 29: newDataFile = String.format(CP_D_DOCNAME_FORMAT, uniqueIdArr[CP_UNIQUE_ID_INDEX], currentLocale); 30: } 31: //Replacing the current Node datasource with localized datafile. 32: presenter.getConfiguration().getDatasource().setNodeIdDatasource(newDataFile); 33: } 34: } With this bean code available to our WebCenter Portal implementation we can start the next step, by overriding the standard behavior in content presenter by applying a MDS Taskflow customization to the content presenter taskflow, following taskflow customization has been applied to the customization project attached to this post:- Library: WebCenter Document Library Service View- Path: oracle.webcenter.doclib.view.jsf.taskflows.presenter- File: contentPresenter.xml Changes made in above customization view:1. A new method invocation activity has been added (initLocaleForDataFile)2. The method invocation invokes the new NLSHelperBean3. The default activity is moved to the new Method invocation (initLocaleForDataFile)4. The outcome from the method invocation goes to determine-navigation (original default activity) The above changes concludes the presentation modification to support a compatible localization scenario for a content driven page. In addition this customization do not limit or disables the out of the box capabilities of WebCenter Portal. 
Steps to enable above customization Start JDeveloper and open your WebCenter Portal Application Select "Open Project" and include the extracted project you downloaded (CPNLSCustomizations.zip) Make sure the build out put from CPNLSCustomizations project is a dependency to your Portal project Deploy your Portal Application to your WC_CustomPortal managed server Make sure your naming convention of the two data files follow above recommendation Example result of the solution: Solution Scenario 2 - Localization aware content creation and authoring As you could see from Solution Scenario 1 we require the naming convention to be strictly followed, this means in the hands of a user with limited technology knowledge this can be one of the failing links in this solutions. Therefore I strongly recommend that you also follow this part since this will eliminate this risk and also increase the editors/authors usability with a magnitude. The current WebCenter Portal Architecture leverages WebCenter Content today to maintain, publish and manage content, therefore we need to make few efforts in making sure this part of the architecture is on board with our new naming practice and also simplifies the creation of content for our end users. As you probably remember the naming convention required a prefix to be common so I propose we enable a new component that help you auto name the content items dDocName (this means that the readable title can still be in a human readable format). The new component (WCP-LocalizationSupport.zip) built for this scenario will enable a couple of things: 1. A new service where a sequential number can be generate on request - service name: GET_WCP_LOCALE_CONTENTID 2. The content presenter is leveraging a specific function when launching the content creation wizard from within Content Presenter. Assumption is that users will create the content by clicking "Create Web Content" button. When clicking the button the wizard opened is actually running in side of WebCenter Content server, file executed (contentwizard.hcsp). 
This file uses JSON commands that will generate operations in the content server, I have extend this file to create two identical data files instead of one.- First it creates the English version by leveraging the new Service and a Global Rule to set the dDocName on the original check in screen, this global rule is available in a configuration package attached to this blog (NLSContentProfileRule.zip)- Secondly we run a set of JSON javascripts to create the Spanish version with the same details except for the name where we replace the suffix with (_ES)- Then content creation wizard ends with its out of the box behavior and assigns the Content Presenter instance the English versionSee Javascript markup below - this can be changed in the (WCP-LocalizationSupport.zip/component/WCP-LocalizationSupport/publish/webcenter) 1: //---------------------------------------A-TEAM--------------------------------------- 2: WCM.ContentWizard.CheckinContentPage.OnCheckinComplete = function(returnParams) 3: { 4: var callback = WCM.ContentWizard.CheckinContentPage.checkinCompleteCallback; 5: WCM.ContentWizard.ChooseContentPage.OnSelectionComplete(returnParams, callback); 6: // Load latest DOC_INFO_SIMPLE 7: var cgiPath = DOCLIB.config.httpCgiPath; 8: var jsonBinder = new WCM.Idc.JSONBinder(); 9: jsonBinder.SetLocalDataValue('IdcService', 'DOC_INFO_SIMPLE'); 10: jsonBinder.SetLocalDataValue('dID', returnParams.dID); 11: jsonBinder.Send(cgiPath, $CB(this, function(http) { 12: var ret = http.GetResponseText(); 13: var binder = new WCM.Idc.JSONBinder(ret); 14: var dDocName = binder.GetResultSetValue('DOC_INFO', 'dDocName', 0); 15: if(dDocName.indexOf("_") > 0){ 16: var ssBinder = new WCM.Idc.JSONBinder(); 17: ssBinder.SetLocalDataValue('IdcService', 'SS_CHECKIN_NEW'); 18: //Additional Localization dDocName generated 19: ssBinder.SetLocalDataValue('dDocName', getLocalizedDocName(dDocName, "es")); 20: ssBinder.SetLocalDataValue('primaryFile', 'default.xml'); 21: ssBinder.SetLocalDataValue('ssDefaultDocumentToken', 'SSContributorDataFile'); 22:   23: for(var n = 0 ; n < binder.GetResultSetFields('DOC_INFO').length ; n++) { 24: var field = binder.GetResultSetFields('DOC_INFO')[n]; 25: if(field != 'dID' && 26: field != 'dDocName' && 27: field != 'dID' && 28: field != 'dReleaseState' && 29: field != 'dRevClassID' && 30: field != 'dRevisionID' && 31: field != 'dRevLabel') { 32: ssBinder.SetLocalDataValue(field, binder.GetResultSetValue('DOC_INFO', field, 0)); 33: } 34: } 35: ssBinder.Send(cgiPath, $CB(this, function(http) {})); 36: } 37: })); 38: } 39:   40: //Support function to create localized dDocNames 41: function getLocalizedDocName(dDocName, lang) { 42: var result = dDocName.replace("_EN", ("_" + lang)); 43: return result; 44: } 45: //---------------------------------------A-TEAM--------------------------------------- 3. 
By applying the enclosed NLSContentProfileRule.zip, the check in screen for DataFile creation will have auto naming enabled with localization suffix (default is English)You can change the default language by updating the GlobalNlsRule and assign preferred prefix.See Rule markup for dDocName field below: <$executeService("GET_WCP_LOCALE_CONTENTID")$><$dprDefaultValue=WCP_LOCALE.LocaleContentId & "_EN"$> Steps to enable above extensions and configurations Install WebCenter Component (WCP-LocalizationSupport.zip), via the AdminServer in WebCenter Content Administration menus Enable the component and restart the content server Apply the configuration bundle to enable the new Global Rule (GlobalNlsRule), via the WebCenter Content Administration/Config Migration Admin New Content Creation Experience Result Content EditingContent editing will by default be enabled for authoring in the current select locale since the content file is selected by (Solution Scenario 1), this means that a user can switch his browser locale and then get the editing experience adaptable to the current selected locale. NotesA-Team are planning to post a solution on how to inline switch the locale of the WebCenter Portal Session, so the Content Presenter, Navigation Model and other Face related features are localized accordingly. Content Presenter examples used in this post is an extension to following post:https://blogs.oracle.com/ATEAM_WEBCENTER/entry/enable_content_editing_of_iterative Downloads CPNLSCustomizations.zip - WebCenter Portal, Content Presenter Customization https://blogs.oracle.com/ATEAM_WEBCENTER/resource/stefan.krantz/CPNLSCustomizations.zip WCP-LocalizationSupport.zip - WebCenter Content, Extension Component to enable localization creation of files with compliant auto naminghttps://blogs.oracle.com/ATEAM_WEBCENTER/resource/stefan.krantz/WCP-LocalizationSupport.zip NLSContentProfileRule.zip - WebCenter Content, Configuration Update Bundle to enable Global rule for new check in naming of data fileshttps://blogs.oracle.com/ATEAM_WEBCENTER/resource/stefan.krantz/NLSContentProfileRule.zip

    Read the article

  • The ethics of aggregating other peoples' content

    - by AaronBertrand
    Content theft Most of the recent - what I'll call - "discussions" about content aggregators have revolved around content theft. I'm not going to flip-flop on that topic: when you use someone else's work, you ask first, and if given the okay, you cite it. You can't take my blog article and post it on your site, implying that it is your own. Likewise, you shouldn't be taking a Books Online topic, changing three words and adding an intro sentence, and making that a blog post. I think this stuff is pretty...(read more)

    Read the article

  • Remove URLs to unindex blog content from Google, then copy the blog's content to a new blog [closed]

    - by sam
    Possible Duplicate: migrating PR / rankings from one site to another I've been writing a blog for the past year or so, with about 300 published articles; the blog has been running under a subdomain, blog.mysite.com. We are now looking to change the URL of our site, so the blog is going to have to be ported over to a subdomain on the new site. We would really like to keep the backlog / archive of all the articles we have written, but we don't want to be penalized for having duplicate content. Could we just remove / unindex the URLs from Google in Webmaster Tools, then export the blog and import it back into our new blog? Would Google still see this as duplicate content, or because I've removed the URLs do they no longer have a copy of it? Thanks

    Read the article

  • SEO for Country & Language Specific content.

    - by kecebongsoft
    Currently I am creating a website which has a common topic for an article, but the content is going to be different for each country, and each piece of content will also be provided in several languages. This mechanism exists in most parts of the website. For example, I have an article about tax. This article has to be different for each country, for example China, and the tax content for China should be written in Chinese AND in English (for non-Chinese speakers). What is the best URL pattern to handle this? What I've been thinking is using a subfolder (/country-code/language-code/) such as: www.example.com/cn/cn/tax www.example.com/cn/en/tax Or using a country-code top-level domain (ccTLD) such as: www.example.cn/cn/tax www.example.cn/en/tax Or a subdomain such as: cn.example.com/cn/tax cn.example.com/en/tax I don't prefer the last option since I might need to use subdomains for other purposes, which leaves only the subfolder and ccTLD options. I've read some articles saying that a ccTLD is good for localized (language-specific) content, but in my case my ccTLD will also have English content (for non-local speakers) which is specific only to that particular country (the purpose being to let people from other countries easily find it through Google). What is the best pattern to pick, and why?

    Read the article
