Search Results

Search found 21197 results on 848 pages for 'webcenter content'.

Page 46/848 | < Previous Page | 42 43 44 45 46 47 48 49 50 51 52 53  | Next Page >

  • ASP.NET Security Exception when Switch IIS7 to Use UNC Path for Content

    - by Jeremy H.
    I have a Windows Server 2008 R2 box running IIS7.5 with Medium Trust configured for ASP.NET. When I have the website running from local content (e.g.: c:\inetpub\wwwroot) everything works fine. When I change IIS to use a UNC path for the content (e.g.: \\computer\wwwroot) I get the following error: Security Exception Description: The application attempted to perform an operation not allowed by the security policy. To grant this application the required permission please contact your system administrator or change the application's trust level in the configuration file. Exception Details: System.Security.SecurityException: Request for the permission of type 'System.Data.SqlClient.SqlClientPermission, System.Data, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' failed. I'm trying to figure out why ASP.NET/IIS would allow for the SQL call when using local content but not when using a UNC path. Any ideas what I need to do to use a UNC path from IIS7 properly?
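
    For what it's worth, a hedged sketch of the usual workarounds (not from the thread): under the legacy CAS model, content served from a UNC path runs with Intranet-zone permissions, which exclude SqlClientPermission even where local content at Medium Trust would be granted it. One option is raising the trust level in web.config:

        <!-- web.config sketch: run the site at full trust, if that is acceptable -->
        <system.web>
          <trust level="Full" />
        </system.web>

    Alternatively, caspol.exe can grant FullTrust to the share machine-wide (roughly: caspol -m -ag 1. -url "file://\\computer\wwwroot\*" FullTrust), which keeps the application at Medium Trust while trusting the content location.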

    Read the article

  • Create Word files from Excel content

    - by Lennart
    I have an Excel file that I want to split into several files (Word; PDF is also fine), based on content. The content is somewhat like this:

        Person  Fase  Date        Item  Text
        A       1     01-01-2012  Z     Lorem ipsum
        A       2     01-02-2012  X     Lorem ipsum
        B       1     02-01-2012  Y     Lorem ipsum
        C       2     01-01-2012  Z     Lorem ipsum

    I want Word/PDF documents with names like Person_Fase.docx, and as content the date, item and text, ideally in a table layout. Any hints/clues on how to get there? It's about 700 clients, with up to 300 Excel entries each.

    Read the article

  • Disable static content caching in IIS 7

    - by Lee Richardson
    I'm a developer having what should be a relatively simple problem in IIS 7 on Windows Server 2008 R2. The problem is that IIS 7 is overzealously caching all static content on the server. It's caching all .html and .js content and not noticing when the content changes on disk unless I iisreset. I've tried the following:

      - Deleting the local cache in my browser (I'm 99% positive this is a server caching issue)
      - In IIS Admin, in OutputCaching, adding an .html extension and unchecking "User-mode caching" and "Kernel-mode caching"
      - In IIS Admin, in OutputCaching, adding an .html extension, checking "User-mode caching" and selecting the radio button for "Prevent all caching"
      - In IIS Admin, editing the Output Cache feature settings and unchecking "Enable cache" and "Enable kernel cache" under OutputCaching
      - Running "C:\Windows\System32\inetsrv\config\appcmd set config "SharePoint - 80" -section:system.webServer/caching -enabled:false"
      - Looking through applicationHost.config and disabling anything related to caching I could find

    Nothing seems to work. I'm getting very frustrated. Can anyone please help?
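
    One more thing worth trying, offered only as a sketch (it's not from the thread): turning off both output-cache layers directly in web.config for the site, which at least takes the admin UI out of the loop. The section and attributes below are standard IIS 7 configuration:

        <system.webServer>
          <!-- sketch: disable user-mode and kernel-mode output caching -->
          <caching enabled="false" enableKernelCache="false" />
        </system.webServer>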

    Read the article

  • SharePoint MOSS - Serve HTTP content on an HTTPS page without Mixed Content Warning?

    - by kcb263
    Our "portal-like" SharePoint site is served using HTTPS/SSL, so a user goes to https://web.company.com and sees content and different Web Parts. So far, no problem. The desire now is to add new Web Parts that either frame HTTP content (such as Weather Bug) or pull in HTTP RSS feeds. The issue is that doing this results in a "Mixed Content" warning in the browser. Has anybody successfully been able to implement such a scenario, or one similar to it? The options we have looked at, unsuccessfully, have been:

      - using an Apache reverse proxy server
      - mirroring an external site
      - custom Web Parts

    Read the article

  • Static Content Caching in Sticky Sessions?

    - by Ravi
    Can we do static content caching on sticky-session servers? We use SqlStateServer to store user sessions. Right now we are doing performance tuning in our application, so we decided to cache the static content (images, CSS, JS) so that the application loads faster. Is it a good idea to cache static content in a sticky-session setup? If so, can anyone give me some links where I can read about it? Right now I have the following settings in my web.config file:

        <staticContent>
          <clientCache cacheControlCustom="public"
                       cacheControlMode="UseMaxAge"
                       cacheControlMaxAge="500.00:00:00" />
        </staticContent>

    Is this good code? Will it affect our sticky-session environment? My goal is to cache the static images, CSS and JS for the application.
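
    For reference, a hedged sketch of a common variant of that snippet (not from the thread): HTTP caching guidance has traditionally capped max-age at one year, so a 500-day value is usually shortened, and file names are versioned so clients pick up changes:

        <staticContent>
          <!-- sketch: one-year client cache; bump a version query string
               (e.g. site.css?v=2) when a file changes so browsers re-fetch -->
          <clientCache cacheControlCustom="public"
                       cacheControlMode="UseMaxAge"
                       cacheControlMaxAge="365.00:00:00" />
        </staticContent>

    Since sticky sessions only pin a user to a server, and identical static files should exist on every server anyway, client-side caching of them is generally orthogonal to session affinity.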

    Read the article

  • Mixed content warning on ASPX page

    - by Amit
    Hi, we have started receiving a mixed content warning on ASPX pages on our secured site. We do not have any mixed content; we load all our JS, images, CSS and ASPX files using HTTPS. I don't know why we have started receiving these warnings now. The latest thing we added is a third-party control for dialog boxes from Essential Objects; we were already using their menu control but added the dialog box recently. We have also made our application browser-compatible. I feel the cause is somewhere between these two changes. Can anyone suggest a solution or a workaround, especially if you have used Essential Objects controls and faced a similar issue? Essential Objects says it is not their problem. The mixed content warning appears at any time, not specifically when the Essential Objects dialog box pops up; that's why I am a bit confused. Any help is highly appreciated. Thanks.

    Read the article

  • Setting correct Content-Type sent from WordPress on Apache server

    - by eoinoc
    I need help pointing me in the right direction for setting the Content-Type returned by Apache for content produced by WordPress. I'm having trouble figuring out why WordPress is returning incorrect headers.

    Issue: The specific problem is that our WordPress blog pages are being downloaded as a file rather than displayed by Internet Explorer and Chrome v21. Content-Type: application/x-gzip is being returned by the server. I'm told that I should expect Content-Type: text/html.

    Background: The URL is http://www.bitesizeirishgaelic.com/blog/.

    Read the article

  • How do I resize an iframe with dynamic content?

    - by Middletone
    I'm populating an iframe with content that is injected into it, but I'm unable to size it correctly. I've tried a variety of methods; here is the most recent code. After populating the iframe with an HTML-based mail message, it doesn't seem to register the scroll height.

        <iframe scrolling="no" width="100%" onload="javascript:(LoadIFrame(this));">
          HTML content goes here. Make sure it's long enough to need to scroll before testing.
        </iframe>

        <script type="text/javascript">
        function LoadIFrame(iframe) {
            $("iframe").each(function() {
                $(this).load(function() {
                    var doc = this.document;
                    if (this.contentDocument) {
                        doc = this.contentDocument;
                    } else if (this.contentWindow) {
                        doc = this.contentWindow.document;
                    } else if (this.document) {
                        doc = this.document;
                    }
                    this.height = (doc.body.scrollHeight + 50) + 'px';
                    this.onload = '';
                });

                txt = $(this).text();

                var doc = this.document;
                if (this.contentDocument) {
                    doc = this.contentDocument;
                } else if (this.contentWindow) {
                    doc = this.contentWindow.document;
                } else if (this.document) {
                    doc = this.document;
                }
                doc.open();
                doc.writeln(txt);
                doc.close();
            });
        }
        </script>

    Read the article

  • Can I place AdSense ads on a site with drug content?

    - by Joe Majewski
    I am going to be opening a new WordPress blog today, and I was curious to know if my site will be appropriate for AdSense ads. First off, the blog is going to be about drug addiction. More specifically, opiate addiction. I myself was addicted to pain killers for three years, and I am pleased to say that I am now in treatment and I have not used in over a month now. :) Anyways, my site will be an informative blog about opiates and the harm that they can induce on one's life. It will in no way endorse the use of drugs, but I may talk about my negative experiences. It will be 100% (without a doubt) evident upon checking out my website that my view on drugs is negative. Two questions: (1) Is this content acceptable for Google's terms of service? (2) In order to begin displaying AdSense ads, is there anything I must first do? I already have AdSense ads running on one of my domains. Can I simply begin pasting the code snippets into my new domain and watch as the ads begin to work? Thanks for your time. :)

    Read the article

  • USDM and Oracle Offer a New Part 11 Compliant Solution for Life Sciences

    - by Michael Snow
    Guest post today provided by Oracle partner, USDM.

    Regulated Content in WebCenter: USDM and Oracle offer a new Part 11 compliant solution for Life Sciences (White Paper)

    Life science customers now have the ability to take advantage of all of the benefits of Oracle's WebCenter Content, a global leader in Enterprise Content Management. For the past year, USDM has been developing best practice compliance solutions to meet regulated content management requirements for 21 CFR Part 11 in WebCenter Content. USDM has been an expert in ECM for life sciences since 1999, and in 2011 certified that WebCenter was a 21 CFR Part 11 compliant content management platform (White Paper). In addition, USDM has built Validation Accelerator Packs for WebCenter to enable life science organizations to quickly and cost effectively validate this world class solution.

    With the Part 11 certification, Oracle's WebCenter now provides regulated life science organizations the ability to manage regulatory content in WebCenter, as well as the ability to take advantage of all of the additional functionality of WebCenter, including a complete, open, and integrated portfolio of portal, web experience management, content management and social networking technology. Examples of Part 11 functionality included in the product: E-Sign, E-Sign Rendor, Meta Data History, Audit Trail Report, and Access Reporting.

    Gone are the days that life science companies have to spend millions of dollars a year to implement, maintain, and validate ECM systems that no longer meet the ever-changing business and regulatory requirements. Life science companies now have the ability to use WebCenter Content, an ECM system with a substantially lower cost of ownership and unsurpassed functionality.

    Oracle has been #1 in life sciences because of their ability to develop cost effective, easy-to-use, scalable solutions which help increase insight and efficiency to drive growth for their customers. Adding a world class ECM solution to this product portfolio gives life science organizations the chance to get rid of costly ECM systems that no longer meet their needs and use WebCenter, part of the Oracle Fusion Technology stack, with their other leading enterprise applications.

    USDM provides:
      - Expertise in Life Science ECM Business Processes
      - Prebuilt Life Science Configuration in WebCenter
      - Validation Accelerator Packs for WebCenter

    USDM is very proud to support Oracle's expanding commitment to Life Sciences.

    For more information please contact: [email protected]

    Oracle will be exhibiting at DIA 2012 in Philadelphia on June 25-27. Stop by our booth (#2825) to learn more about the advantages of a centralized ECM strategy and see the Oracle WebCenter Content solution, our 21 CFR Part 11 compliant content management platform.

    Read the article

  • getting base64 content string of an image from a mimepart in Java

    - by Bas van den Broek
    Hello, I am trying to get the base64 content of a MimePart in a MimeMultiPart, but I'm struggling with the JavaMail package. I simply want the base64-encoded String of a certain inline image; there doesn't seem to be an easy way to do this, though. I wrote a method that takes the MIME content (as a string) and an image name as parameters, searches for the part that contains the base64 content of that image name, and in the end returns this base64 string (as well as the content type, but that is irrelevant for this question). Here is the relevant code:

        private static String[] getBase64Content(String imageName, String mimeString)
                throws MessagingException, IOException {
            System.out.println("image name: " + imageName + "\n\n");
            System.out.println("mime string: " + mimeString);
            String[] base64Content = new String[2];
            base64Content[0] = "";
            base64Content[1] = "image/jpeg"; // some default value
            DataSource source = new ByteArrayDataSource(
                    new ByteArrayInputStream(mimeString.getBytes()), "multipart/mixed");
            MimeMultipart mp = new MimeMultipart(source);
            for (int i = 0; i < mp.getCount(); i++) {
                MimePart part = (MimePart) mp.getBodyPart(i);
                String disposition = part.getDisposition();
                if (disposition != null && disposition.equals(Part.INLINE)) {
                    // check if this is the right part
                    if (part.getContentID() != null && part.getContentID().indexOf(imageName) > -1) {
                        if (part.getContent() instanceof BASE64DecoderStream) {
                            BASE64DecoderStream base64DecoderStream = (BASE64DecoderStream) part.getContent();
                            StringWriter writer = new StringWriter();
                            IOUtils.copy(base64DecoderStream, writer);
                            String base64decodedString = writer.toString();
                            byte[] encodedMimeByteArray = Base64.encodeBase64(base64decodedString.getBytes());
                            String encodedMimeString = new String(encodedMimeByteArray);
                            System.out.println("encoded mime string: " + encodedMimeString);
                            base64Content[0] = encodedMimeString;
                            base64Content[1] = getContentTypeString(part);
                        }
                    }
                }
            }
            return base64Content;
        }

    I cannot paste all of the output as the post would be too long, but this is some of it:

        image name: [email protected]

    This is a part of the mimeString input; it does find this (correct) part with the image name:

        --_004_225726A14AF9134CB538EE7BD44373A04D9E3F3940menexch2007ex_
        Content-Type: image/gif; name="image001.gif"
        Content-Description: image001.gif
        Content-Disposition: inline; filename="image001.gif"; size=1070;
          creation-date="Fri, 02 Apr 2010 16:19:43 GMT";
          modification-date="Fri, 02 Apr 2010 16:19:43 GMT"
        Content-ID: <[email protected]>
        Content-Transfer-Encoding: base64

        R0lGODlhEAAQAPcAABxuHJzSlDymHGy2XHTKbITCdNTu1FyqTHTCXJTKhLTarCSKHEy2JHy6bJza
        lITKfFzCPEyWPHS+XHzCbJzSjFS+NLTirBx6HHzKdOz27GzCZJTOjCyWHKzWpHy2ZJTGhHS+VLzi
        (more base64 string here that I'm not going to paste)

    But when it finally prints the encoded MIME string, it is a different string than I was expecting:

        encoded mime string: R0lGODlhEAAQAO+/vQAAHG4c77+90pQ877+9HGzvv71cdO+/vWzvv73vv71077+977+977+9XO+/vUx077+9XO+/vcqE77+92qwk77+9HEzvv70kfO+/vWzvv73alO+

    Clearly different from the output in the part above. I'm not even sure what I'm looking at here, but when I try to load this as an image in an HTML page, it won't work. This is fairly frustrating, since all I want is a piece of the text that I'm already printing, but I'd rather not search through the MIME string myself for the correct part, introducing all kinds of bugs. So I'd really prefer to use the JavaMail library, but could use some help on how to actually get that correct MIME string.
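
    For what it's worth, the giveaway in that output is the repeating "77+9", which is the base64 encoding of U+FFFD, the Unicode replacement character: copying the decoded binary stream through a StringWriter runs the image bytes through the platform charset, corrupting them before they are re-encoded. A minimal sketch of an alternative, assuming commons-io and commons-codec are on the classpath:

        import java.io.InputStream;
        import javax.mail.Part;
        import org.apache.commons.codec.binary.Base64;
        import org.apache.commons.io.IOUtils;

        public class PartEncoder {
            // Sketch: re-encode a part's content without routing the
            // binary data through a String/charset round trip.
            public static String partToBase64(Part part) throws Exception {
                InputStream decoded = part.getInputStream();       // decoded (binary) image data
                byte[] imageBytes = IOUtils.toByteArray(decoded);  // raw bytes, no charset applied
                return new String(Base64.encodeBase64(imageBytes), "US-ASCII");
            }
        }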

    Read the article

  • Ajax-loaded content, jQuery plugins not working

    - by Sylph
    Hello, I have a link that calls ajax to load content, and after the content is loaded, my jQuery function doesn't work anymore. Here is my HTML:

        <a href="#" onclick="javascript:makeRequest('content.html','');">Load Content</a>
        <span id="result">
          <table id="myTable" valign="top" class="tablesorter">
            <thead>
              <tr>
                <th>Title 1</th>
                <th>Level 1</th>
                <th>Topics</th>
                <th>Resources</th>
              </tr>
            </thead>
            <tbody>
              <tr>
                <td>Example title 1</td>
                <td>Example level 1</td>
              </tr>
              <tr>
                <td>Example title 2</td>
                <td>Example level 2</td>
              </tr>
            </tbody>
          </table>
        </span>

    The table is sorted using the jQuery tablesorter plugin from http://tablesorter.com/docs/. After the ajax content is loaded, another table with different data is displayed. However, the sorting doesn't work anymore. Here is the ajax script used to load the content:

        var http_request = false;

        function makeRequest(url, parameters) {
            http_request = false;
            if (window.XMLHttpRequest) { // Mozilla, Safari,...
                http_request = new XMLHttpRequest();
                if (http_request.overrideMimeType) {
                    // set type accordingly to anticipated content type
                    //http_request.overrideMimeType('text/xml');
                    http_request.overrideMimeType('text/html');
                }
            } else if (window.ActiveXObject) { // IE
                try {
                    http_request = new ActiveXObject("Msxml2.XMLHTTP");
                } catch (e) {
                    try {
                        http_request = new ActiveXObject("Microsoft.XMLHTTP");
                    } catch (e) {}
                }
            }
            if (!http_request) {
                alert('Cannot create XMLHTTP instance');
                return false;
            }
            http_request.onreadystatechange = alertContents;
            http_request.open('GET', url + parameters, true);
            http_request.send(null);
        }

        function alertContents() {
            if (http_request.readyState == 4) {
                // alert(http_request.status);
                if (http_request.status == 200) {
                    //alert(http_request.responseText);
                    result = http_request.responseText;
                    document.getElementById('result').innerHTML = result;
                } else {
                    alert('There was a problem with the request.');
                }
            }
        }

    Any idea how I can get the jQuery plugins to work after the content is loaded? I have searched and changed the click binding in jquery.tablesorter.js to live(), like this: $headers.live("click", function(e), but that doesn't work either. How can I make the jQuery functions work after the content is loaded? Thank you.
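
    A sketch of the usual fix, assuming the replacement markup contains a fresh table: tablesorter binds to the DOM nodes that exist when it is initialised, so a table inserted via innerHTML is never wired up. Re-initialising the plugin in the success branch of the callback addresses that:

        if (http_request.status == 200) {
            document.getElementById('result').innerHTML = http_request.responseText;
            // re-bind the plugin to the newly inserted table
            $('#result table.tablesorter').tablesorter();
        }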

    Read the article

  • How to create a link to Nintex Start Workflow Page in the document set home page

    - by ybbest
    In this blog post, I'd like to show you how to create a link to the Nintex start workflow page on the document set home page.

    1. First, upload the latest version of jQuery to the style library of your team site.
    2. Then, upload a text file to the style library for writing your own HTML and JavaScript.
    3. On the document set home page, insert a new Content Editor web part and link the text file you just uploaded.
    4. Update the text file with the following content (you can download this file here):

        <script type="text/javascript" src="/Style%20Library/jquery-1.9.0.min.js"></script>
        <script type="text/javascript" src="/_layouts/sp.js"></script>
        <script type="text/javascript">
        $(document).ready(function() {
            listItemId = getParameterByName("ID");
            setTheWorkflowLink("YBBESTDocumentLibrary");
        });

        function buildWorkflowLink(webRelativeUrl, listId, itemId) {
            var workflowLink = webRelativeUrl + "_layouts/NintexWorkflow/StartWorkflow.aspx?list=" +
                listId + "&ID=" + itemId + "&WorkflowName=Start Approval";
            return workflowLink;
        }

        function getParameterByName(name) {
            name = name.replace(/[\[]/, "\\\[").replace(/[\]]/, "\\\]");
            var regexS = "[\\?&]" + name + "=([^&#]*)";
            var regex = new RegExp(regexS);
            var results = regex.exec(window.location.search);
            if (results == null) {
                return "";
            } else {
                return decodeURIComponent(results[1].replace(/\+/g, " "));
            }
        }

        function setTheWorkflowLink(listName) {
            var SPContext = new SP.ClientContext.get_current();
            web = SPContext.get_web();
            list = web.get_lists().getByTitle(listName);
            SPContext.load(web, "ServerRelativeUrl");
            SPContext.load(list, 'Title', 'Id');
            SPContext.executeQueryAsync(setTheWorkflowLink_Success, setTheWorkflowLink_Fail);
        }

        function setTheWorkflowLink_Success(sender, args) {
            var listId = list.get_id();
            var listTitle = list.get_title();
            var webRelativeUrl = web.get_serverRelativeUrl();
            var startWorkflowLink = buildWorkflowLink(webRelativeUrl, listId, listItemId);
            $("a#submitLink").attr('href', startWorkflowLink);
        }

        function setTheWorkflowLink_Fail(sender, args) {
            alert("There is a problem setting up the submit exam approval link");
        }
        </script>

        <a href="" target="_blank" id="submitLink"><span style="font-size:14pt">Start the approval process.</span></a>

    5. Save your changes and go to the document set item; you will see the link on the home page now.

    Notes:
    1. You can create a link to start the workflow using the following build-dynamic-string configuration: {Common:WebUrl}/_layouts/NintexWorkflow/StartWorkflow.aspx?list={Common:ListID}&ID={ItemProperty:ID}&WorkflowName=workflowname. With this link you will still need to click the start button; this is standard SharePoint behaviour and cannot be altered.

    References:
    http://connect.nintex.com/forums/27143/ShowThread.aspx
    How to use html and JavaScript in Content Editor web part in SharePoint2010

    Read the article

  • Employee Engagement Q&A with John Brunswick

    - by Kellsey Ruppel
    As we are focusing this week on Employee Engagement, I recently sat down with industry expert and thought leader John Brunswick on the topic. Here is the Q&A dialogue we shared.

    Q: How do you effectively engage employees to drive business value?
    A: Motivation, both extrinsic and intrinsic, combined with the relevancy of various channels to support it. Beyond chaining business strategies like compensation models within an organization, engagement ultimately is most successful when driven by employees' motivations. Business value derived from engagement through technical capabilities can be objectively measured through metrics like the rate and accuracy of problem solving for a given business function, or the frequency of innovation created. Providing employees performing "knowledge work" with capabilities that allow them to perform work with a higher degree of accuracy, in the same or ideally less time, adds value for that individual and in turn drives their level of engagement to drive business value.

    Q: Organizations with high levels of employee engagement outperform the total stock market index by 22%. Can you comment on why you think this might be?
    A: Alignment through shared purpose. Zappos is an excellent example of a culture that arguably has higher than average levels of employee engagement, and it permeates every aspect of their organization, embodied externally through their customer experience. I recently made my first purchase with them, and it was obvious through their web experience, visual design, communication style, customer service and attention to detail, down to the green packaging, that they have an amazingly strong shared purpose. The Zappos.com 'About' page outlines their "Family Core Values", the first three being "Deliver WOW Through Service", "Embrace and Drive Change" and "Create Fun and A Little Weirdness", all reflected externally in my interaction with them. Strong shared purpose enables a higher product and service experience, equating to a dedicated customer base, repeat purchases and expanded market share.

    Q: Have you seen any trends in the market regarding employee engagement?
    A: Some companies now see offering a form of social engagement similar to Facebook and LinkedIn as standard communication infrastructure, like email or instant messaging. Originally offered as standalone tools, the value is now seen when these capabilities are offered in an integrated fashion in the context of business entities. An emerging area of focus is employee activity related to their organization on external social platforms, implicitly creating external communities with employees acting on behalf of the brand and interacting with each other (e.g. Twitter). Companies have reached a formal understanding that this now-established communication medium requires strategies allowing employees to engage. I have personally met colleagues from Oracle, like Oracle User Experience Director Ultan O'Broin (@ultan), via Twitter before meeting them through internal channels.

    Q: Employee engagement is important, but what about engaging customers and partners?
    A: Over the last few years we have witnessed an interesting evolution from the novelty of self-service to expectations of "intelligent" self-service. From a consumer standpoint, engagement can end up being a key differentiator, especially in mature markets. Customers that perform some level of interaction with a brand develop greater affinity for the brand and have a greater probability of acting as an advocate. As organizations move toward a model of deeper engagement, they must ensure that their business is positioned to support deeper relationships, offering potentially greater transparency. From a partner standpoint, greater engagement can lead to new types of business opportunities, much in the way that Amazon.com offers a unified shopping experience that can potentially span various vendors. This same model can be extended to blending services and product delivery models, based on a closeness not easily possible before the increased capability of engagement mechanisms.

    Q: What types of solutions are available to successfully deliver employee engagement?
    A: Solutions enabling higher levels of engagement do so on the basis of relevancy. This relevancy is generally supported by aspects of content management, social collaboration, business intelligence, portal and process management technologies. These technologies can help deliver an experience tailored to a given role or process within an organization that applies equally to work that is structured or unstructured, appearing in the form of functionality as simple as an online employee directory search, knowledge communities supported by social collaboration, as well as more feature-rich business intelligence dashboards and portals.

    Looking to learn more about how to effectively engage your employees? Check out this webcast, or read more from John Brunswick.

    Read the article

  • drupal what if we have designed a content type with existing fields

    - by rakeshakurathi
    I have a small problem. I have one content type, say Cars, with more than 30 fields, and users can create content of this type. Now I would like to show only a few of the fields in different phases. Is there any way to do that? More explanation: the user may enter the car model and car details on the first page and upload images on a second page (say, a popup in a block). Is this possible? I'm a newbie to Drupal and would like to do this kind of data updating. I thought about designing one more content type with the existing fields; can anyone explain this approach? What if I design a content type Car1 with the same fields (say, a file upload) as the Car content type?

    Read the article

  • Exception when programmatically copying content type

    - by BeraCim
    Hi all: I'm getting exceptions when copying a content type from one web to another:

        foreach (SPContentType destinationWebCt in destinationWeb.ContentTypes)
        {
            destinationWeb.ContentTypes.Add(existingWebCt);
            destinationWeb.Update();
        }

    existingWebCt is the content type from another web, e.g. /Site/Web. destinationWeb is the web that I wish to copy the content type to, e.g. /Site/DestinationWeb. I got an SPException that says something like "could not copy content type to scope". Then I decided to replace all "ContentTypes" with "AvailableContentTypes", but then I got an SPException saying this collection could not be modified. So how can I copy a content type to another web? Thanks.
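
    For what it's worth, a sketch of one alternative, assuming both webs live in the same site collection (content types are scoped, so a cross-web "copy" is really creating a new content type derived from the original) and avoiding the other pitfall above, modifying destinationWeb.ContentTypes while enumerating it. The site URLs are placeholders:

        // Sketch: create a new content type in the destination web derived
        // from the source content type, instead of Add()-ing the original.
        using (SPSite site = new SPSite("http://server/sites/Site"))
        using (SPWeb destinationWeb = site.OpenWeb("DestinationWeb"))
        {
            SPContentType copy = new SPContentType(
                existingWebCt,                 // source content type (acts as parent)
                destinationWeb.ContentTypes,   // target collection
                existingWebCt.Name);           // name in the destination web
            destinationWeb.ContentTypes.Add(copy);
            destinationWeb.Update();
        }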

    Read the article

  • Communicate between content script and options page

    - by Gaurang Tandon
    I have seen many questions already, and all are about background page to content script. Summary: my extension has an options page and a content script. The content script handles the storage functionality (chrome.storage manipulation). Whenever a user changes a setting on the options page, I want to send a message to the content script to store the new data. My code:

    options.js:

        var data = "abcd"; // let data
        chrome.tabs.query({ active: true, currentWindow: true }, function (tabs) {
            chrome.tabs.sendMessage(tabs[0].id, "storeData:" + data, function (response) {
                console.log(response); // gives undefined :(
            });
        });

    content script js:

        chrome.runtime.onMessage.addListener(function (request, sender, sendResponse) {
            // not working
        });

    My question: why isn't this approach working? Is there any other (better) approach for this procedure?
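
    Two things worth checking, offered as a sketch rather than a definitive answer: chrome.tabs.query({ active: true, currentWindow: true }) run from the options page matches the options tab itself, where no content script is injected, so the message may have no receiver; and the listener must call sendResponse (returning true for an asynchronous reply) before the sender's callback sees anything but undefined:

        // content script (sketch): store the payload, then acknowledge
        chrome.runtime.onMessage.addListener(function (request, sender, sendResponse) {
            if (typeof request === "string" && request.indexOf("storeData:") === 0) {
                chrome.storage.local.set({ data: request.slice("storeData:".length) }, function () {
                    sendResponse({ stored: true });
                });
                return true; // keep the message channel open for the async response
            }
        });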

    Read the article

  • 2 sites each in a different country with 1 set of content (cloaking)

    - by Greg
    Hi, I have a question re: cloaking. I have a friend who has a business in Canada and the UK. Currently the .ca site is hosted on Godaddy. The co.uk domain is registered (with uk ip address) with domainmonster and is using a cloaked/framed redirect to the .ca site. As a result (my assumption) the .ca site is indexed fine by google, the .co.uk is not. The content is generic for both sites. How do I point the .co.uk site directly to the content independently (preferably without duplicating the content hosting in the UK), so that for instance if the .ca domain was taken away altogether the .co.uk domain would remain an entity in itself from Google's point of view? Does Google index a generic set of content and then associate different country domains with that content? I hope I have explained this ok. Thanks, Greg

    Read the article

  • Access <body> element from content page via a nested master page

    - by danwellman
    All I want to do is access the <body> element from the code-behind of a content page and add a class name to it. I have a top-level master page with the <body> element in it. Then I have a nested master page, which is the master page for the content page. From the code-behind of the content page I want to add a class name to the body element. That's all. I have this in the top-level master:

        <body id="bodyNode" runat="server">

    I added this to the code-behind for the content page:

        Master.bodyNode.Attributes.Add("class", "home-page");

    And I get a message that:

        'System.Web.UI.MasterPage' does not contain a definition for 'bodyNode'

    If I add this to the aspx content page:

        <%@ MasterType VirtualPath="~/MasterPage.master" %>

    the message then changes to:

        'bodyNode' is inaccessible due to its protection level

    Please advise, I've wasted like 2 hours on what feels like something that should be really simple to do :(
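
    A sketch of one workaround, assuming the content page's direct master is the nested one (so the top-level master sits at Page.Master.Master) and that bodyNode lives in the top-level master. FindControl sidesteps both the MasterType wiring and the protected field:

        // Code-behind of the content page (sketch)
        using System.Web.UI.HtmlControls;

        protected void Page_Load(object sender, EventArgs e)
        {
            // content page -> nested master -> top-level master
            var topMaster = Page.Master.Master;
            var body = topMaster.FindControl("bodyNode") as HtmlGenericControl;
            if (body != null)
                body.Attributes["class"] = "home-page";
        }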

    Read the article

  • jQuery Dialog open hides background content in IE7

    - by JoshReedSchramm
    Hey all, I'm working on an application where the page layout is essentially a bunch of absolutely positioned containers. There is a header, sidebar and content area. The content area has its own header and footer which are fixed on the screen. The main part of the content area scrolls but nothing else does. So, we are using jQuery dialog for popup boxes and in IE7 with this new layout when i open a dialog the rest of my content in the background vanishes. I'm left with the page header and the footer of the content area and that's it. In Firefox and believe it or not IE6 it works fine. We are using ie7.js in ie6 though to make things not suck. Anyone have any clue why opening a dialog in jquery would screw up absolute positioning and make stuff vanish?

    Read the article

  • android: Dynamically changing the content of a tab

    - by Jomia
    I want to change the content of a tab. When the tab is created, I set its content with the setContent() method. But if I click the tab again, I want to change the content, that is, switch to another activity. I used the setOnTabChangedListener() method, but I am not sure how to set the content to another intent:

        Resources res = getResources();
        TabHost tabHost = getTabHost();
        tabHost.addTab(tabHost.newTabSpec("tab1").setIndicator("HOME")
                .setContent(new Intent(getBaseContext(), homeGroup.class)));
        tabHost.addTab(tabHost.newTabSpec("tab2").setIndicator("ABOUT US")
                .setContent(new Intent(getBaseContext(), aboutusGroup.class)));
        tabHost.setCurrentTab(0);

        tabHost.setOnTabChangedListener(new OnTabChangeListener() {
            @Override
            public void onTabChanged(String tabId) {
                // here i want to set the content of each tab to another intent
                // for 'tab1', change to home.class
                // for 'tab2', change to aboutus.class
                // how to set these?
            }
        });

    Please help me. Thank you.
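
    A sketch of the pattern that was commonly paired with ActivityGroup-based tabs (which the homeGroup/aboutusGroup classes suggest): a tab's content Intent is fixed once the tab is added, so rather than changing it, the group activity swaps in the view of a new child activity. The method and ids below are illustrative, not from the post:

        // Inside an ActivityGroup such as homeGroup (sketch):
        // replace the group's content with another child activity's view.
        public void switchTo(String id, Class<?> activityClass) {
            Intent intent = new Intent(this, activityClass)
                    .addFlags(Intent.FLAG_ACTIVITY_CLEAR_TOP);
            View view = getLocalActivityManager()
                    .startActivity(id, intent)  // e.g. "home" -> home.class
                    .getDecorView();
            setContentView(view);               // shown inside the current tab
        }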

    Read the article

  • Loading external content with jquery or iframe?

    - by nailuenlue
    Hiho, there's an existing website that I need to include into another site. It goes like this: the content lives at a.mysite.com, and I need to fetch content from that site into my www.mysite.com website. Since I need to access the content of the iframe, the same-origin policy creates a problem here. What I did was configure mod_proxy on Apache to proxy all requests from www.mysite.com/a to a.mysite.com. This works fine, but my problem is that I'm not sure of the best way to include those pages.

    Idea 1: As the content of the iframe is a full-featured site with a top navigation, left navigation, etc., I would need to change the page template to show only the content box, to be able to integrate that page in the iframe.

    Idea 2: I could just load the DIV where the content lies through jQuery.load() and integrate it into my site.

    What is the best way to accomplish such a task? And how bad are both ideas from the SEO point of view?
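
    For the second idea, a minimal sketch using jQuery's load() with a fragment selector (the URL and selector are placeholders); note this only works because mod_proxy makes the content same-origin under /a:

        // fetch just the content DIV from the proxied page and inject it
        $('#container').load('/a/some-page.html #contentBox', function () {
            // runs after the fragment has been inserted
        });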

    Read the article

  • jQuery ready() on downloaded content

    - by David
    Hi all! jQuery.ready() allows us to wait for the construction of the webpage, and recently support was added to wait until CSS files are loaded. I would like to know if that feature can be used for downloaded content, because I fetch content via $.ajax() that holds CSS references, and I would like to wait for the CSS to load before working with the retrieved content. In short:

    1. Fetch the HTML with $.ajax().
    2. Wait until all of its CSS is downloaded.
    3. Show the fetched content (already styled).

    Thank you very much for your help.
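
    Since ready() only covers the initial document, fetched fragments need their own gate. A sketch of one approach, assuming the fragment's stylesheet URL is known in advance (a placeholder below) and that the browser fires load on dynamically added <link> elements, which older browsers handle inconsistently:

        $.get('/fragment.html', function (html) {
            var link = document.createElement('link');
            link.rel = 'stylesheet';
            link.href = '/css/fragment.css'; // placeholder: CSS the fragment needs
            link.onload = function () {
                $('#target').html(html); // show content only once its CSS is in place
            };
            document.getElementsByTagName('head')[0].appendChild(link);
        });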

    Read the article

  • ASP.NET GZip Encoding Caveats

    - by Rick Strahl
    GZip encoding in ASP.NET is pretty easy to accomplish using the built-in GZipStream and DeflateStream classes and applying them to the Response.Filter property. While applying GZip and Deflate behavior is pretty easy, there are a few caveats that you have to watch out for, as I found out today for myself with an application that was throwing up some garbage data. But before looking at caveats let's review GZip implementation for ASP.NET.

    ASP.NET GZip/Deflate Basics

    Response filters basically are applied to the Response.OutputStream and transform it as data is written to it through the ASP.NET Response object. So a Response.Write eventually gets written into the output stream which, if a filter is attached, is also written through the filter stream's interface. To perform the actual GZip (and Deflate) encoding typically used by Web pages, .NET includes the GZipStream and DeflateStream stream classes which can be readily assigned to the Response.Filter property. With these two stream classes in place it's almost trivially easy to create a couple of reusable methods that allow you to compress your HTTP output. In my standard WebUtils utility class (from the West Wind Web Toolkit) I created two static utility methods – IsGZipSupported and GZipEncodePage – that check whether the client supports GZip encoding and then actually encode the current output (note that although the method includes 'Page' in its name this code will work with any ASP.NET output).

        /// <summary>
        /// Determines if GZip is supported
        /// </summary>
        /// <returns></returns>
        public static bool IsGZipSupported()
        {
            string AcceptEncoding = HttpContext.Current.Request.Headers["Accept-Encoding"];
            if (!string.IsNullOrEmpty(AcceptEncoding) &&
                (AcceptEncoding.Contains("gzip") || AcceptEncoding.Contains("deflate")))
                return true;
            return false;
        }

        /// <summary>
        /// Sets up the current page or handler to use GZip through a Response.Filter
        /// IMPORTANT:
        /// You have to call this method before any output is generated!
        /// </summary>
        public static void GZipEncodePage()
        {
            HttpResponse Response = HttpContext.Current.Response;

            if (IsGZipSupported())
            {
                string AcceptEncoding = HttpContext.Current.Request.Headers["Accept-Encoding"];

                if (AcceptEncoding.Contains("deflate"))
                {
                    Response.Filter = new System.IO.Compression.DeflateStream(
                        Response.Filter, System.IO.Compression.CompressionMode.Compress);
                    Response.Headers.Remove("Content-Encoding");
                    Response.AppendHeader("Content-Encoding", "deflate");
                }
                else
                {
                    Response.Filter = new System.IO.Compression.GZipStream(
                        Response.Filter, System.IO.Compression.CompressionMode.Compress);
                    Response.Headers.Remove("Content-Encoding");
                    Response.AppendHeader("Content-Encoding", "gzip");
                }

                // Allow proxy servers to cache encoded and unencoded versions separately
                Response.AppendHeader("Vary", "Content-Encoding");
            }
        }

    As you can see the actual assignment of the Filter is as simple as:

        Response.Filter = new DeflateStream(Response.Filter, System.IO.Compression.CompressionMode.Compress);

    which applies the filter to the OutputStream. You also need to ensure that your response reflects the new GZip or Deflate encoding, and that any pages cached in proxy servers can differentiate between pages that were encoded with the various different encodings (or no encoding) – hence the Vary header at the end of GZipEncodePage().

    Using this utility function is trivially easy: in any ASP.NET code that wants to compress its Response output, you simply call it before any output is generated:

        protected void Page_Load(object sender, EventArgs e)
        {
            WebUtils.GZipEncodePage();

            Entry = WebLogFactory.GetEntry();

            var entries = Entry.GetLastEntries(App.Configuration.ShowEntryCount,
                "pk,Title,SafeTitle,Body,Entered,Feedback,Location,ShowTopAd", "TEntries");
            if (entries == null)
                throw new ApplicationException("Couldn't load WebLog Entries: " + Entry.ErrorMessage);

            this.repEntries.DataSource = entries;
            this.repEntries.DataBind();
        }

    Here I use an ASP.NET page, but the above WebUtils.GZipEncodePage() method call will work in any ASP.NET application type, including HTTP handlers. The only requirement is that the filter needs to be applied before any other output is sent to the OutputStream. For example, in my CallbackHandler service implementation, output over a certain size is GZip encoded by default. The output that is generated is JSON or XML, and if the output is over 5k in size I apply WebUtils.GZipEncodePage():

        if (sbOutput.Length > GZIP_ENCODE_TRESHOLD)
            WebUtils.GZipEncodePage();

        Response.ContentType = ControlResources.STR_JsonContentType;
        HttpContext.Current.Response.Write(sbOutput.ToString());

    Ok, so you probably get the idea: encoding GZip/Deflate content is pretty easy.

    Hold on there Hoss – Watch your Caching

    Or is it? There are a few caveats that you need to watch out for when dealing with GZip content. The first issue is that you need to deal with the fact that some clients don't support GZip or Deflate content. Most modern browsers support it, but if you have a programmatic HTTP client accessing your content, GZip/Deflate support is by no means guaranteed. For example, WinInet HTTP clients don't support GZip out of the box – it has to be explicitly implemented. Other low-level HTTP clients on other platforms don't support GZip out of the box either.

    The problem is that your application, your Web server and proxy servers on the Internet might be caching your generated content. If you return content with GZip once and then again without, either caching is not applied or, worse, the wrong type of content is returned back to the client from a cache or proxy. The result is an unreadable response for *some clients*, which is also very hard to debug and fix once in production.

    You already saw the issue of proxy servers addressed in the GZipEncodePage() function:

        // Allow proxy servers to cache encoded and unencoded versions separately
        Response.AppendHeader("Vary", "Content-Encoding");

    This ensures that any proxy servers also check the Content-Encoding HTTP header to cache their content – not just the URL.

    The same thing applies if you do OutputCaching in your own ASP.NET code. If you generate output for GZip on an OutputCached page, the GZipped content will be cached (either by ASP.NET's cache or in some cases by the IIS kernel cache). But what if the next client doesn't support GZip? She'll get served a cached GZip page that won't decode, and she'll get a page full of garbage. Wholly undesirable.

    To fix this you need to add a custom OutputCache rule by way of the GetVaryByCustomString() HttpApplication method in your global.asax file:

        public override string GetVaryByCustomString(HttpContext context, string custom)
        {
            // Override Caching for compression
            if (custom == "GZIP")
            {
                string acceptEncoding = HttpContext.Current.Response.Headers["Content-Encoding"];

                if (string.IsNullOrEmpty(acceptEncoding))
                    return "";
                else if (acceptEncoding.Contains("gzip"))
                    return "GZIP";
                else if (acceptEncoding.Contains("deflate"))
                    return "DEFLATE";

                return "";
            }

            return base.GetVaryByCustomString(context, custom);
        }

    In a page that uses output caching you then specify:

        <%@ OutputCache Duration="180" VaryByParam="none" VaryByCustom="GZIP" %>

    to use that custom rule.

    It's all Fun and Games until ASP.NET throws an Error

    Ok, so you're up and running with GZip, you have your caching squared away, and the pages you are applying it to are jamming along. Then BOOM, something strange happens and you get a lovely garbled page. Lovely, isn't it?

    What's happened here is that I have WebUtils.GZipEncodePage() applied to my page, but there's an error in the page. The error falls back to the ASP.NET error handler, and the error handler removes all existing output (good) and removes all the custom HTTP headers I've set manually (usually good, but very bad here). Since I applied the Response.Filter (via GZipEncodePage) the output is now GZip encoded, but ASP.NET has removed my Content-Encoding header, so the browser receives the GZip encoded content without a notification that it is encoded as GZip. The result is binary output. Here's what Fiddler says about the raw HTTP header output when an error occurs while GZip encoding was applied:

        HTTP/1.1 500 Internal Server Error
        Cache-Control: private
        Content-Type: text/html; charset=utf-8
        Date: Sat, 30 Apr 2011 22:21:08 GMT
        Content-Length: 2138
        Connection: close

        ?`I?%&/m?{J?J??t??`  … binary output stripped here

    Notice: no Content-Encoding header, and that's why we're seeing this garbage. ASP.NET has stripped the Content-Encoding header but left our filter intact. So how do we fix this?

    In my applications I typically have a global Application_Error handler set up, and in this case I've been using that. One thing that you can do in the Application_Error handler is explicitly clear out the Response.Filter and set it to null at the top:

        protected void Application_Error(object sender, EventArgs e)
        {
            // Remove any special filtering especially GZip filtering
            Response.Filter = null;
            …
        }

    And voila, I get my Yellow Screen of Death or my custom generated error output back via uncompressed content. BTW, the same is true for page-level errors handled in Page_Error or ASP.NET MVC error handling methods in a controller.

    Another and possibly even better solution is to check whether a filter is attached just before the headers are sent to the client, as pointed out by Adam Schroeder in the comments:

        protected void Application_PreSendRequestHeaders()
        {
            // ensure that if GZip/Deflate Encoding is applied that headers are set
            // also works when error occurs if filters are still active
            HttpResponse response = HttpContext.Current.Response;
            if (response.Filter is GZipStream &&
                response.Headers["Content-encoding"] != "gzip")
                response.AppendHeader("Content-encoding", "gzip");
            else if (response.Filter is DeflateStream &&
                response.Headers["Content-encoding"] != "deflate")
                response.AppendHeader("Content-encoding", "deflate");
        }

    This uses the Application_PreSendRequestHeaders() pipeline event to check for compression encoding in a filter and adjusts the content accordingly. This is actually a better solution since it is generic – it'll work regardless of how the content is cleaned up. For example, an error Response.Redirect() or short error display might get changed and the filter not cleared, and this code actually handles that. Sweet, thanks Adam.

    It's unfortunate that ASP.NET doesn't natively clear out Response.Filters when an error occurs, just as it clears the Response and headers. I can't see where leaving a filter in place in an error situation would make any sense, but hey – this is what it is and it's easy enough to fix as long as you know where to look. Riiiight!

    IIS and GZip

    I should also mention that IIS 7 includes good support for compression natively. If you can defer encoding and let IIS perform it for you rather than doing it in your code, by all means you should do it! Especially any static or semi-dynamic content that can be made static should be using IIS built-in compression. Dynamic caching is also supported but is a bit more tricky to judge in terms of performance and footprint. John Forsyth has a great article on the benefits and drawbacks of IIS 7 compression which gives some detailed performance comparisons and impact reviews. I'll post another entry next with some more info on IIS compression since information on it seems to be a bit hard to come by.

    Related Content:
    Built-in GZip/Deflate Compression in IIS 7.x
    HttpWebRequest and GZip Responses

    © Rick Strahl, West Wind Technologies, 2005-2011
    Posted in ASP.NET, IIS7

    Read the article

< Previous Page | 42 43 44 45 46 47 48 49 50 51 52 53  | Next Page >