Search Results

Search found 27294 results on 1092 pages for 'site deployment'.


  • How can I obfuscate a dll when using a Visual Studio deployment project?

    - by LeeW
    Hi all, I need to obfuscate a dll that is used in an ASP.NET project; the deployment project produces a setup.exe which I want to distribute. I have the VS 2008 Dotfuscator installed, but when I build the deployment project, the project that creates the dll is rebuilt before it is added to the deployment project and included in the setup.exe. Any suggestions on how I can get round this? Many thanks, Lee

    Read the article

  • Problem with filefield module after migrating drupal site to a new server: can't upload files

    - by oalo
    We have a content type with two imagefield / filefield fields, and after migrating our site to a new server we have the following problem: when we submit a new item for this content type, with two images for those fields, Drupal gives us the following error and does not upload the images:

        warning: fopen(sites/default/files/.htaccess) [function.fopen]: failed to open stream: Permission denied in /websites/sitename/data/sites/all/modules/filefield/field_file.inc on line 349.
        warning: fopen(sites/default/files/.htaccess) [function.fopen]: failed to open stream: Permission denied in /websites/sitename/data/sites/all/modules/filefield/field_file.inc on line 349.
        An image thumbnail was not able to be created.
        warning: fopen(sites/default/files/.htaccess) [function.fopen]: failed to open stream: Permission denied in /websites/sitename/data/sites/all/modules/filefield/field_file.inc on line 349.
        warning: fopen(sites/default/files/.htaccess) [function.fopen]: failed to open stream: Permission denied in /websites/sitename/data/sites/all/modules/filefield/field_file.inc on line 349.
        An image thumbnail was not able to be created.

    I understand this is a permissions error, but it is not clear to me where I have to change permissions. Line 349 of field_file.inc has the following code:

        if (($fp = fopen("$directory/.htaccess", 'w')) && fputs($fp, $htaccess_lines)) {
          fclose($fp);
          chmod($directory .'/.htaccess', 0664);
        }
        else {
          $repl = array('%directory' => $directory, '!htaccess' => nl2br(check_plain($htaccess_lines)));
          form_set_error($form_item, t("Security warning: Couldn't write .htaccess file. Please create a .htaccess file in your %directory directory which contains the following lines:!htaccess", $repl));

    Read the article

  • Prepare your site images for google image search indexing

    - by Vittorio Vittori
    Hi, I'm trying to understand what I can do to make my site reachable by Google Image Search spiders. I like last.fm's solution, and I thought of using a technique like theirs to let Google find artist images on their pages. When I'm looking for an artist and search on Google Image Search, as often as not I find an image from a last.fm artist page. An example: if I search for the band Pure Reason Revolution, it brings me to the artist's image page, http://www.last.fm/music/Pure+Reason+Revolution/+images/4284073. Now if I take a look at the image file, I can see it's named http://userserve-ak.last.fm/serve/500/4284073/Pure+Reason+Revolution+4.jpg, so if I try to understand how the service works, the URL breaks down as:

        http://userserve-ak.last.fm/serve/    the server that serves the images
        500/                                  the selected size for the image
        4284073/                              the image id for the database
        Pure+Reason+Revolution+4.jpg          the image name

    I find it hard to believe the real filename for the image is Pure+Reason+Revolution+4.jpg, because of overwrite problems when a user uploads it; in fact if I type http://userserve-ak.last.fm/serve/500/4284073.jpg I probably find the real image location and filename. With this technique the image is highly reachable by search engines and easily archived. My question is: is there a guide or tutorial on this kind of technique, or something similar?

    Read the article

  • What is the benefit of using ONLY OpenID authentication on a site?

    - by Peter
    From my experience with OpenID, I see a number of significant downsides:

    Adds a Single Point of Failure to the site. It is not a failure that can be fixed by the site even if detected. If the OpenID provider is down for three days, what recourse does the site have to allow its users to log in and access the information they own?

    Takes a user to another site's content every time they log on to your site. Even if the OpenID provider does not have an error, the user is redirected to their site to log in. The login page has content and links, so there is a chance a user will actually be drawn away from the site to go down the Internet rabbit hole. Why would I want to send my users to another company's website? [Note: my provider no longer does this and seems to have fixed this problem (for now).]

    Adds a non-trivial amount of time to signup. To sign up with the site, a new user is forced to read a new standard, choose a provider, and sign up. Standards are something that technical people should agree on in order to make the user experience frictionless; they are not something that should be thrust on the users.

    It is a Phisher's Dream. OpenID is incredibly insecure and stealing a person's ID as they log in is trivially easy. [Taken from David Arno's answer below.]

    For all of the downsides, the one upside is allowing users to have fewer logins on the Internet. If a site has opt-in for OpenID then users who want that feature can use it. What I would like to understand is: what benefit does a site get for making OpenID mandatory?

    Read the article

  • Making a jQuery plugin to feed Tumblr to site

    - by tylorreimer
    I have some experience with PHP and a little with JS, but I'm far from proficient. I'm trying to make a jQuery plugin for my site that I can call from my HTML via something like this:

        $('.new').tumble({username: "tylor", count: 9});

    which would basically put the Tumblr list the code builds into the DIV with class 'new', in this case. The problem seems to be how to get the plugin to pick up the class/id from the original call (in the HTML) and use that in the jQuery. Here's the code so far:

        (function($) {
            $.fn.tumble = function(options) {
                var settings = $.extend({
                    username: null, // [string or array] required to get url for tumblr account
                    count: 3,       // [integer] how many posts to display?
                }, options);

                // url construction
                var url = "http://" + settings.username + ".tumblr.com";
                var jsonurl = url + "/api/read/json?num=" + settings.count + "&callback=?";

                $.getJSON(jsonurl, function(data) {
                    var items = [];
                    // Goes over each post in the JSON document retrieved from the data URL
                    $.each(data.posts, function(id, url) {
                        var url = this.url;                   // Just assigns a variable to the url to avoid constantly writing "this.whatever"
                        var photourl = this['photo-url-250']; // photo-url-xxx needs to be accessed this way due to integers in the name
                        items.push('<li><a href="' + url + '">' + photourl + '</a></li>');
                    });
                    $('<ul/>', {                  // Creates an empty list
                        html: items.join('')      // Takes the values in the item array and puts 'em together
                    }).appendTo('.new');          // I don't want this to have the class hard-coded in the jQuery itself
                }); // end json
            };
        })( jQuery );

    Any help you can lend would be wonderful. Thank you.

    Read the article

  • Stopping cookies being set from a domain (aka "cookieless domain") to increase site performance

    - by Django Reinhardt
    I was reading Google's documentation about improving site speed. One of their recommendations is serving static content (images, CSS, JS, etc.) from a "cookieless domain":

        Static content, such as images, JS and CSS files, don't need to be accompanied by cookies, as there is no user interaction with these resources. You can decrease request latency by serving static resources from a domain that doesn't serve cookies.

    Google then says that the best way to do this is to buy a new domain and set it to point to your current one:

        To reserve a cookieless domain for serving static content, register a new domain name and configure your DNS database with a CNAME record that points the new domain to your existing domain A record. Configure your web server to serve static resources from the new domain, and do not allow any cookies to be set anywhere on this domain. In your web pages, reference the domain name in the URLs for the static resources.

    This is pretty straightforward stuff, except for the bit where it says to "configure your web server to serve static resources from the new domain, and do not allow any cookies to be set anywhere on this domain". From what I've read, there's no setting in IIS that lets you say "serve static resources", so how do I prevent ASP.NET from setting cookies on this new domain? At present, even if I'm just requesting a .jpg from the new domain, it sets a cookie on my browser, even though our application's cookies are set to our old domain. For example, ASP.NET sets an ".ASPXANONYMOUS" cookie that (as far as I'm aware) we're not telling it to set. Apologies if this is a real newb question, I'm new at this! Thanks.
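
    One possible approach, offered here only as a hedged sketch rather than a confirmed fix, is to strip any cookies ASP.NET has queued before the response headers go out for requests on the static host (the host name "static.example.com" below is an assumption for illustration). Turning off anonymousIdentification in that site's web.config is another option worth checking for the .ASPXANONYMOUS cookie specifically.

        // Global.asax.cs -- minimal sketch, assuming "static.example.com" is the cookieless domain.
        using System;
        using System.Web;

        public class Global : HttpApplication
        {
            // Runs just before response headers are written; clearing the cookie collection here
            // stops ASP.NET from emitting Set-Cookie (e.g. .ASPXANONYMOUS) on the static host.
            protected void Application_PreSendRequestHeaders(object sender, EventArgs e)
            {
                if (Request.Url.Host.Equals("static.example.com", StringComparison.OrdinalIgnoreCase))
                {
                    Response.Cookies.Clear();
                }
            }
        }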

    Read the article

  • position of View on asp.net mvc site master page

    - by ognjenb
    How do I fix the data table so it opens only in the main content frame? The structure of my site.master page is: left content, main content and right content. When a View opens in the main content area, it spills into the right content area if it is large. Is this a CSS problem? My problem is similar to this: http://www.inq.me/post/ASPNet-MVC-Extension-method-to-create-a-Security-Aware-HtmlActionLink.aspx

    This is my CSS (it came with the template):

        /*~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
          PRIMARY LAYOUT STYLES
        ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~*/
        .content-container {
            position:relative;
            _height:1px;
            min-height:1px;
            width:900px;
            /* background:url(images/bg-column-left.png) repeat-y;*/
        }
        .content-container-inner {
            /*background:url(images/bg-column-right.png) repeat-y right;*/
            _height:1px;
            min-height:1px;
            /*padding:0 200px;*/
            position:relative;
            /*width:900px;*/
        }
        .content-main {
            padding:15px 0% 0px 2%;
            /*position:relative;*/
            min-height:1px;
            _height:1px;
            float:left;
            position:relative;
            /*width:96%;*/
            /*width:900px;*/
        }
        .content-left {
            padding:20px 10px;
            float:left;
            width:180px;
            margin-top:-1px;
            position:relative;
            margin-left:-100%;
            right:200px;
            _left:200px;
            border-top:1px dotted #797979;
        }
        .content-right {
            padding:15px 10px 10px 10px;
            float:left;
            width:160px;
            position:relative;
            margin-right:-200px;
        }
        .ads {
            text-align:center;
            margin:20px 0;
        }

    Read the article

  • what is the best way to optimize my json on an asp.net-mvc site

    - by ooo
    I am currently using jqGrid on an ASP.NET MVC site. We have a pretty slow network (internal application) and the grid seems to be taking a long time to load (the issue is partly network, partly parsing and rendering). I am trying to figure out how to minimize what I send to the client to make it as fast as possible. Here is a simplified view of my controller action that loads data into the grid:

        [AcceptVerbs(HttpVerbs.Get)]
        public ActionResult GridData1(GridData args)
        {
            var paginatedData = applications.GridPaginate(args.page ?? 1, args.rows ?? 10, i => new
            {
                i.Id,
                Name = "<div class='showDescription' id='" + i.id + "'>" + i.Name + "</div>",
                MyValue = GetImageUrl(_map, i.value, "star"),
                ExternalId = string.Format("<a href=\"{0}\" target=\"_blank\">{1}</a>",
                    Url.Action("Link", "Order", new { id = i.id }), i.Id),
                i.Target,
                i.Owner,
                EndDate = i.EndDate,
                Updated = "<div class='showView' aitId='" + i.AitId + "'>" + GetImage(i.EndDateColumn, "star") + "</div>",
            });
            return Json(paginatedData);
        }

    So I am building up JSON data (about 200 records of the above) and sending it back to the GUI to put in the jqGrid. The one thing I can think of is repeated data: in some of the JSON fields I am appending HTML around the raw data, and it is the same HTML on every record. It seems like it would be more efficient if I could just send the data and "append" the HTML around it on the client side. Is this possible? Then I would only be sending the actual data over the wire and have the client side add the rest of the HTML tags (the divs, etc.). Also, if there are any other suggestions on how I can minimize the size of my messages, that would be great. I guess at some point these solutions will increase the client-side load, but it may be worth it to cut down on network traffic.
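
    To illustrate the "send only data, add markup on the client" idea, here is a minimal sketch of the same action returning raw values only, so the repeated HTML never crosses the wire (the client-side jqGrid column formatters would rebuild the divs and anchors). GridPaginate and the property names come from the question; everything else is an assumption, not the poster's actual code:

        [AcceptVerbs(HttpVerbs.Get)]
        public ActionResult GridData1(GridData args)
        {
            // Return only the raw field values; the repeated HTML wrappers are rebuilt
            // client-side, which keeps the JSON payload to the bare data.
            var paginatedData = applications.GridPaginate(args.page ?? 1, args.rows ?? 10, i => new
            {
                i.Id,
                i.Name,
                Value = i.value,
                i.Target,
                i.Owner,
                i.EndDate,
                AitId = i.AitId
            });

            // On MVC 2 and later, JSON over GET also needs JsonRequestBehavior.AllowGet:
            // return Json(paginatedData, JsonRequestBehavior.AllowGet);
            return Json(paginatedData);
        }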

    Read the article

  • how to get the contents of a site that uses HTTPS

    - by cashmoney
    An example of a site using SSL (HTTPS): https://www.eb2a.com

    1 - I tried to get its content using file_get_contents, but it does not work and gives an error:

        <?php
        $contents = file_get_contents("https://www.eb2a.com/");
        echo $contents;
        ?>

    2 - I tried to use fopen, but it does not work and gives an error:

        <?php
        $url = 'https://www.eb2a.com/';
        $contents = fopen($url, 'r');
        echo "$contents";
        ?>

    3 - I tried to use cURL, but it does not work and gives a BLANK PAGE:

        function cURL($url, $ref, $header, $cookie, $p){
            $ch = curl_init();
            curl_setopt($ch, CURLOPT_URL, $url);
            curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
            curl_setopt($ch, CURLOPT_USERAGENT, $_SERVER['HTTP_USER_AGENT']);
            curl_setopt($ch, CURLOPT_REFERER, $ref);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
            curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
            if ($p) {
                curl_setopt($ch, CURLOPT_CUSTOMREQUEST, "POST");
                curl_setopt($ch, CURLOPT_POST, 1);
                curl_setopt($ch, CURLOPT_POSTFIELDS, $p);
            }
            $result = curl_exec($ch);
            curl_close($ch);
            if ($result){
                return $result;
            } else {
                return '';
            }
        }

        $file = cURL('https://www.eb2a.com/','https://www.eb2a.com/',0,0,null);
        echo $file;

    Does anyone have any idea?

    Read the article

  • Re-authentication required for registered-path links (to ASP.NET site) coming to IE from PowerPoint

    - by Daniel Halsey
    We're using URL routing based on Phil Haack's example, with config modifications based on MSDN Library article #CC668202, to provide "shareable" links for an ASP.NET forms site, and have run into a strange issue: for users attempting to open links from PowerPoint presentations, and who have IE set as their default browser, using one of these links forces (forms-based) re-authentication, even in the same browser instance with a live session. Info:

    We know the session is still alive. (The page returns information for the currently logged-in user; confirmed via debug watches.)

    This doesn't happen with other browsers (FF, Chrome) or with other programs (Notepad++) as the URL source.

    We do not have a default path set, as this caused issues with root path handling at initial login.

    This primarily happens with PowerPoint, but will also happen in Word and OCS.

    On some machines, even after changing the default browser, Office apps will continue to use IE for these links, forcing this error. (A potential registry fix for this failed, but even if it had worked, we can't control default browser choice for our users.)

    We can't figure out if this is an Office oddity or is being caused by our decision to use app-level URL routing (rather than IIS rewriting). Has anyone else encountered this and found a solution?
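
    One data point on the "Office oddity" theory: Office applications typically pre-fetch a hyperlink with their own, unauthenticated HTTP session before handing the URL to the browser, so the forms-auth redirect can happen inside that hidden request. A commonly described workaround is to answer those pre-fetch requests with a plain 200 so Office passes the original URL through untouched. The sketch below is only an illustration of that idea, not the poster's solution; the user-agent substrings are assumptions that should be verified against your own IIS logs.

        using System;
        using System.Web;

        // Hypothetical HttpModule: short-circuit Office link pre-fetch requests so they never
        // hit the forms-authentication redirect. Register it under <modules> in web.config.
        public class OfficeLinkPrefetchModule : IHttpModule
        {
            public void Init(HttpApplication app)
            {
                app.BeginRequest += (sender, e) =>
                {
                    HttpContext ctx = ((HttpApplication)sender).Context;
                    string ua = ctx.Request.UserAgent ?? string.Empty;

                    // Assumed user-agent markers for the Office pre-fetch ("protocol discovery") request.
                    bool isOfficePrefetch =
                        ua.IndexOf("ms-office", StringComparison.OrdinalIgnoreCase) >= 0 ||
                        ua.IndexOf("Microsoft Office", StringComparison.OrdinalIgnoreCase) >= 0;

                    if (isOfficePrefetch)
                    {
                        // Return an empty 200 page; Office then opens the original URL in the browser,
                        // which still carries the live forms-auth cookie.
                        ctx.Response.StatusCode = 200;
                        ctx.Response.ContentType = "text/html";
                        ctx.Response.Write("<html><body></body></html>");
                        app.CompleteRequest();
                    }
                };
            }

            public void Dispose() { }
        }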

    Read the article

  • Running Sitecore Production Site under a Virtual Directory

    - by danswain
    We are using Sitecore 6 on a Windows Server 2003 (32-bit) dev machine. I know it's not recommended for the CMS editing site, but we've been told it is possible to get the front-end Sitecore websites to run from within a virtual directory. Here's the issue: we'd like to achieve what the poor man's diagram below shows. We have a website (.net 1.1):

        /WebSiteRoot (.net 1.1)
         |
         |---- Custom .net 1.1 Web Application
         |
         |---- SiteCore frontend WebApplication (.net 2.0)
         |
         |---- Custom .net 2.0 WebApplication

    The Sitecore WebApplication would contain the Sitecore pipeline in its web.config, and we'd make use of the section to configure the virtual folder to allow for where our Sitecore app sits and point it to the appropriate place in the content tree. Is it possible to pull this off? This is just the customer-facing website; there will be no CMS editing functionality on these servers, that will be done from a more standard Sitecore install inside the firewall on a different server. The errors we're encountering are centered around loading the various config files in the App_Config folder. It seems to do a Server.MapPath on "/" initially (which is wrong for us), so we've tried putting absolute paths in the web.config and still no joy (I think there must be some hardcoded piece that looks for the Include directory). Any help would be greatly appreciated. Thanks

    Read the article

  • How do I retrieve twitter xml for Flash site via php properly

    - by daidai
    I am using TwitterScript to retrieve Twitter data inside a Flash site. Due to Twitter's crossdomain policy, I need to set up a PHP proxy. Firstly I made a simple one:

        <?php
        $url = $_GET['url'];
        readfile($url);
        ?>

    but I then get this error:

        URL file-access is disabled in the server configuration

    which is only resolved by getting my host to turn fopen() on, which I don't want to do. Then I found this:

        <?php
        function get_content($url) {
            $ch = curl_init();
            curl_setopt ($ch, CURLOPT_URL, $url);
            curl_setopt ($ch, CURLOPT_HEADER, 0);
            ob_start();
            curl_exec ($ch);
            curl_close ($ch);
            $string = ob_get_contents();
            ob_end_clean();
            return $string;
        }

        #usage:
        $url = $_GET['url'];
        $content = get_content ($url);
        var_dump ($content);
        ?>

    which solves that problem, and the data is now the correct XML, but it looks like this:

        string(39950) "<?xml version="1.0" encoding="UTF-8"?>
        <statuses type="array">
        <status>
        ...
        </statuses>"

    How do I get the XML data out of that string?

    Read the article

  • Cannot override the CSS at this site

    - by gdanko
    This site is overriding my CSS with its own and I cannot get around it! It has style.css with "text-align: center" on the body. I have <div id="mydiv"> appended to the body and it normally has "text-align: left". There are <ul>s and <li>s underneath #mydiv and they are inheriting the body's 'center' for some reason. I tried this and it's still not working:

        $('#mydiv').children().css('text-align', 'auto');

    How the heck do I reclaim my CSS!? @Grillz, the HTML looks like this:

        <div id="mydiv">
            <ul class="container">
                <li rel="folder" class="category"><a href="#">category1</a>
                    <ul><li rel="file" class="subcategory"><a href="#">subcategory1</a></li></ul>
                    <ul><li rel="file" class="subcategory"><a href="#">subcategory2</a></li></ul>
                </li>
                <li rel="folder" class="category"><a href="#">category2</a>
                    <ul><li rel="file" class="subcategory"><a href="#">subcategory3</a></li></ul>
                    <ul><li rel="file" class="subcategory"><a href="#">subcategory4</a></li></ul>
                </li>
            </ul>

    Read the article

  • Redirect To Another Site With Header Information Attached Javascript

    - by Nick LaMarca
    I am trying to make a client-side click redirect to another site with header information added. My client-side code for the onclick is this:

        function selectApp(appGUID, userId, embedUrl) {
            if (embedUrl === "") {
                var success = setAppGUID(appGUID);
                window.location.replace('AppDetail.aspx');
            }
            else {
                $.ajax({
                    type: "POST",
                    url: embedUrl,
                    contentType: "text/html",
                    beforeSend: function (xhr, settings) {
                        xhr.setRequestHeader("UserId", userId);
                    },
                    success: function (msg) {
                        // go to slx
                        window.location.replace(embedUrl);
                    }
                });
            }
        }

    And the server-side code in "embedUrl" is:

        protected void Page_Load(object sender, EventArgs e)
        {
            string isSet = (String)HttpContext.Current.Session["saveUserID"];
            if (String.IsNullOrEmpty(isSet))
            {
                NameValueCollection headers = base.Request.Headers;
                for (int i = 0; i < headers.Count; i++)
                {
                    if (headers.GetKey(i).Equals("UserId"))
                    {
                        HttpContext.Current.Session["saveUserID"] = headers.Get(i);
                    }
                }
            }
            else
            {
                TextBox1.Text = HttpContext.Current.Session["saveUserID"].ToString();
            }
        }

    This seems to work, but it's not too elegant. Is there a way to redirect with header data, without (what I'm doing now) saving the header info in a session variable and then doing the redirect as two separate pieces?
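
    For context on why this is awkward: a plain browser redirect cannot attach custom request headers, so the usual alternatives are to carry the value in the URL itself or in a cookie (cookies only work if both pages share a domain). Below is a sketch of the query-string variant; it is a different technique from the poster's header approach, the parameter names are assumptions, and the value is visible in the URL, so it should not carry anything sensitive.

        // Sketch only: pass the value on the redirect URL instead of in a header.
        // Caller side: append the user id to the embed URL and redirect in one step.
        protected void RedirectWithUserId(string embedUrl, string userId)
        {
            string target = embedUrl
                + (embedUrl.Contains("?") ? "&" : "?")
                + "userId=" + HttpUtility.UrlEncode(userId);

            Response.Redirect(target);
        }

        // Target page: read it back in Page_Load (replaces the header-scanning loop).
        protected void Page_Load(object sender, EventArgs e)
        {
            string userId = Request.QueryString["userId"];
            if (!String.IsNullOrEmpty(userId))
            {
                Session["saveUserID"] = userId;
            }

            TextBox1.Text = (string)(Session["saveUserID"] ?? "");
        }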

    Read the article

  • boost::function & boost::lambda - call site invocation & accessing _1 and _2 as the type

    - by John Dibling
    Sorry for the confusing title. Let me explain via code:

        #include <string>
        #include <boost\function.hpp>
        #include <boost\lambda\lambda.hpp>
        #include <iostream>

        int main()
        {
            using namespace boost::lambda;
            boost::function<std::string(std::string, std::string)> f = _1.append(_2);
            std::string s = f("Hello", "There");
            std::cout << s;
            return 0;
        }

    I'm trying to use boost::function to create a function that uses lambda expressions to build a new return value, and to invoke that function at the call site, s = f("Hello", "There"). When I compile this, I get:

        1>------ Build started: Project: hacks, Configuration: Debug x64 ------
        1>Compiling...
        1>main.cpp
        1>.\main.cpp(11) : error C2039: 'append' : is not a member of 'boost::lambda::lambda_functor<T>'
        1>        with
        1>        [
        1>            T=boost::lambda::placeholder<1>
        1>        ]

    Using MSVC 9. My fundamental understanding of function and lambdas may be lacking; the tutorials and docs did not help so far this morning. How do I do what I'm trying to do?

    Read the article

  • PHP site scheduling Java execution?

    - by obfuscation
    I'm trying to combine my (slightly limited) PHP experience with my (better) Java experience in a project where I need to allow uploads of Java source files to the server, which the server then compiles with javac. Then, at a set time (e.g. specified on upload), I need to run the compiled program once on the server, which will generate some database info for the PHP site to display. To describe my current programming abilities: I have made many desktop Java programs and am confident in 'pure' Java, but so far have only undertaken a couple of PHP projects (including using the CodeIgniter framework). My motivation for using PHP as the frontend is that I know it is very fast and lightweight, and I will be able to display the results I need very easily with it (a simple DB readout). Ideally, the technology used should be able to be developed on localhost (e.g. WAMP, Tomcat, etc.). Is there any advice you could give on what technology I should use to bridge this gap, and what resources could help in using that technology? I have looked at a few, but have struggled to find documentation covering what I need.

    Read the article

  • ajax panel update during the middle of a function C# ASP.net site

    - by user2615302
    This is the button click. I would like to update LbError.Text to "" before the rest of the function continues. This is my current code:

        protected void BUpload_Click(object sender, EventArgs e)
        {
            LbError.Text = "";
            UpdatePanel1.Update();
            //// need it to update here before it moves on, but it waits till the end to update the label

            // Execute functions.....
            // .......
            // .......
        }

        <asp:UpdatePanel ID="UpdatePanel1" runat="server" UpdateMode="Conditional">
            <ContentTemplate>
                <asp:Label ID="LbError" runat="server" CssClass="failureNotification" Text=""></asp:Label>
                <br /><br />
                <br /><br />
                <asp:TextBox ID="NewData" runat="server"></asp:TextBox><br />
                then click
                <asp:Button ID="BUpload" runat="server" Text="Upload New Data" onclick="BUpload_Click"/><br />

    Things I have tried include adding another UpdatePanel, with the same results. Any help would be greatly appreciated.
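
    One relevant detail: a single async postback can only send one response, so the label update cannot be flushed to the browser partway through the same handler. A commonly used workaround, shown below only as a hedged sketch (the hidden trigger button BProcess is an assumption, not part of the poster's page), is to split the work across two postbacks: the first click clears the label and schedules a second, script-triggered postback that does the heavy lifting once the partial update has rendered.

        // First postback: just update the UI and queue the real work via a script-triggered postback.
        protected void BUpload_Click(object sender, EventArgs e)
        {
            LbError.Text = "";
            UpdatePanel1.Update();

            // Emit a startup script that posts back to a hypothetical hidden button (BProcess)
            // inside the UpdatePanel as soon as this partial render reaches the browser.
            string script = Page.ClientScript.GetPostBackEventReference(BProcess, string.Empty) + ";";
            ScriptManager.RegisterStartupScript(this, GetType(), "runUpload", script, true);
        }

        // Second postback: do the long-running work while the cleared label is already visible.
        protected void BProcess_Click(object sender, EventArgs e)
        {
            // Execute functions.....
        }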

    Read the article

  • Content Query Web Part and the Yes/No Field

    - by Bil Simser
    The Content Query Web Part (CQWP) is a pretty powerful beast. It allows you to do multiple site queries and aggregate the results. This is great for rolling up content and doing some summary type reporting. Here's a trick to remember about Yes/No fields and using the CQWP.

    If you're building a news style site and want to aggregate, say, all the announcements that people tag a certain way up onto the home page, this might be a solution.

    First we need a way for users of all our sites to mark an announcement for inclusion on our intranet home page. We'll do this by modifying the Announcement content type and adding a Yes/No field to it. There are alternate ways of doing this, like building a new Announcement type or stapling a feature to all sites to add our column, but this is pretty low impact and only affects our current site collection, so let's go with it for now, okay? You can berate me in the comments about the proper way I should have done this part.

    Go to the Site Settings for the site collection and click on Site Content Types under Galleries. This takes you to the gallery for this site and all subsites. Scroll down until you see the List Content Types and click on Announcements. Now we're modifying the Announcement content type, which affects all those announcement lists created by default if you're building sites using the Team Site template (or creating a new Announcements list on any site, for that matter).

    Click on Add from new site column under the Column list. This lets us create a new Yes/No field that users will see in Announcement items; the field allows the user to flag the announcement for inclusion on the home page. Feel free to modify the fields as you see fit for your environment, this is just an example.

    Now that we've added the column to our Announcements content type, we can go into any site that has an announcement list, modify an announcement and flag it to be included on our home page. See the new Featured column? That was the result of modifying our Announcements content type on this site collection.

    Now we can move on to the dirty part, displaying it in a CQWP on the home page. And here is where the fun begins (and the head scratching should end). On our home page we want to drop a Content Query Web Part and aggregate any announcement that's been flagged as Featured by the users (we could also add a filter on Expires so we don't show old content, so go ahead and do that if you want). First add a CQWP to the page, then modify the settings for the web part. In the first section, Query, we want the List Type set to Announcements and the Content type set to Announcement.

    Click Apply and you'll see the results display all announcements from every site in the site collection. I have five team sites created, each with a unique announcement added to it.

    Now comes the filtering. We don't want to include every announcement, only the ones users flagged using that Featured column we added. At first blush you might scroll down to the Additional Filters part of the Query options and set the Featured column to be equal to Yes. This seems correct, doesn't it? After all, the column is a Yes/No column, and looking at an announcement in the site, it displays the field as Yes or No. However, after applying the filter, the result is backwards: only the announcements that were not flagged show up (I have the announcements from Team Site 1 and Team Site 4 flagged as Featured). Huh? It's BACKWARDS! Let's confirm that.

    Go back in, change the Additional Filters section from Yes to No, and hit Apply: you get the same result as before. Wait a minute? Shouldn't I see Team Site 1 and 4 if the logic is backwards? Why am I seeing the same thing as before? What gives…

    For whatever reason, unknown to me, a Yes/No field (even though it displays as such) really uses 1 and 0 behind the scenes. Yeah, someone was stuck on using integer values for booleans when they wrote SharePoint (probably after a long night of whiteboarding ways to mess with developers' heads) and came up with this. The solution is pretty simple but not very discoverable: set the Additional Filter to Featured is equal to 1, rather than Yes, and it will filter the items marked as Featured correctly.

    This kind of solution could also be extended and enhanced. Here are a few suggestions and ideas:

    Modify the ItemStyle.xsl file to add a new style for this aggregation which would include the first few paragraphs of the body (or perhaps add another field to the content type called Excerpt or Summary and display that instead).
    Add an Image column to the Announcement content type to include a Picture field and display it in the summary.
    Add a Category choice field (Employee News, Current Events, Headlines, etc.) and add multiple CQWPs to the home page, filtering each one on a different category.

    I know some may find this topic old and dusty, but I didn't see a lot out there specifically on filtering Yes/No fields, and the whole 1/0 trick was a little wonky, so I figured a walkthrough would help in overcoming yet another SharePoint weirdness. With a little work and some creative juices you can easily use the power of aggregation and the CQWP to build a news site from content on your team sites.
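
    The same 1/0 storage shows up if you query the field in code. The sketch below is only an illustration, not part of the original walkthrough: the list name, field name and the CAML Value Type are assumptions to verify against your own environment, but the value of 1 rather than Yes/True mirrors the CQWP filter behaviour described above.

        using Microsoft.SharePoint;

        // Minimal sketch: query an Announcements list for items flagged as Featured.
        // The Yes/No (boolean) field is matched with 1/0, just like the CQWP Additional Filter.
        static void PrintFeaturedAnnouncements(SPWeb web)
        {
            SPList list = web.Lists["Announcements"];

            SPQuery query = new SPQuery();
            query.Query =
                "<Where>" +
                "  <Eq>" +
                "    <FieldRef Name='Featured' />" +
                "    <Value Type='Integer'>1</Value>" +   // 1 = Yes, 0 = No
                "  </Eq>" +
                "</Where>";

            foreach (SPListItem item in list.GetItems(query))
            {
                System.Console.WriteLine(item.Title);   // console/feature-receiver context assumed
            }
        }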

    Read the article

  • Example to get Facebook Events using sdk v4 from fan page into Wordpress site [on hold]

    - by Dorshin
    I've been trying to update to the new Facebook PHP SDK v4 for pulling events into my website, but I'm having trouble finding out how to do it. I want to pull public event information from a Facebook page using its fan page ID number, for example a venue that has multiple events. What are the minimal classes I need to "require_once" and "use" to only pull the events (I don't need to log in)? The site is on WordPress, which doesn't use sessions, so what do I do with the "session_start()" statement? Will it work anyway? Could I get a basic code example of how to get the event info into an array? (I want to make sure I get the syntax correct.) So far I've got the code below, but it is not working.

        session_start();

        require_once( 'Facebook/GraphObject.php' );
        require_once( 'Facebook/GraphSessionInfo.php' );
        require_once( 'Facebook/FacebookSession.php' );
        require_once( 'Facebook/FacebookCurl.php' );
        require_once( 'Facebook/FacebookHttpable.php' );
        require_once( 'Facebook/FacebookCurlHttpClient.php' );
        require_once( 'Facebook/FacebookResponse.php' );
        require_once( 'Facebook/FacebookSDKException.php' );
        require_once( 'Facebook/FacebookRequestException.php' );
        require_once( 'Facebook/FacebookAuthorizationException.php' );
        require_once( 'Facebook/FacebookRequest.php' );
        require_once( 'Facebook/FacebookRedirectLoginHelper.php' );

        use Facebook\GraphSessionInfo;
        use Facebook\FacebookSession;
        use Facebook\FacebookCurl;
        use Facebook\FacebookHttpable;
        use Facebook\FacebookCurlHttpClient;
        use Facebook\FacebookResponse;
        use Facebook\FacebookAuthorizationException;
        use Facebook\FacebookRequestException;
        use Facebook\FacebookRequest;
        use Facebook\FacebookSDKException;
        use Facebook\FacebookRedirectLoginHelper;
        use Facebook\GraphObject;

        function facebook_event_function() {
            FacebookSession::setDefaultApplication('11111111111','00000000000000000');

            /* make the API call */
            $request = new FacebookRequest($session, '/{123456789}/events', 'GET');
            $response = $request->execute();
            $graphObject = $response->getGraphObject();
        }

    So far I'm not getting anything in $graphObject, and it's throwing this error as well:

        PHP Fatal error: Uncaught exception 'Facebook\FacebookAuthorizationException' with message '(#803) Some of the aliases you requested do not exist: v2.0GET' in ../Facebook/FacebookRequestException.php:134

    After I get something in $graphObject, I want to add the info to a DB table. That part I am OK on. Thank you for the help.

    Read the article

  • ASP.NET site move to IIS7 results in gibberish characters in page output

    - by frankadelic
    I have an ASP.NET site that was working fine running on Windows Server 2003 / IIS6. I moved it to Windows Server 2008 / IIS7 and the aspx page output now includes gibberish text, for example: p???? ?????. The majority of the page renders properly, but there is gibberish here and there. I have checked the event logs and there is nothing. Any idea what's going on here? How can I fix this? I have noticed that this issue shows up when I include multiple Server.Execute statements in the aspx code:

        <% Server.Execute("/inc/top.inc"); %>
        <% Server.Execute("/inc/footer.inc"); %>

    The .inc files above contain just HTML. It appears that the files have to be of a significant length to cause the error. Here is the sample HTML I've been testing with:

        <div class="logo">
            <a href="/">
                <img src="/logo.png" alt="logo" width="31" height="29" class="logoimg" />
            </a>
        </div>
        <div class="logo">
            <a href="/">
                <img src="/logo.png" alt="logo" width="31" height="29" class="logoimg" />
            </a>
        </div>
        <div class="logo">
            <a href="/">
                <img src="/logo.png" alt="logo" width="31" height="29" class="logoimg" />
            </a>
        </div>
        <div class="logo">
            <a href="/">
                <img src="/logo.png" alt="logo" width="31" height="29" class="logoimg" />
            </a>
        </div>
        <div class="logo">
            <a href="/">
                <img src="/logo.png" alt="logo" width="31" height="29" class="logoimg" />
            </a>
        </div>
        <div class="logo">
            <a href="/">
                <img src="/logo.png" alt="logo" width="31" height="29" class="logoimg" />
            </a>
        </div>

    Also, the gibberish characters appear inconsistently. If I Ctrl+F5 the pages, the gibberish characters change and occasionally don't appear at all.

    Read the article

  • Trying to read FormsAuthentication tickets to read in other areas of site

    - by Pasha Aryana
    Hi. NOTE: I have included 3 links in here to my localhost areas, but could not submit the post with them, so I separated them with a space character so it would post on Stack Overflow.

    I currently have 2 ASP.NET MVC apps in my solution. First I run the first one by setting it as the startup project. It goes to the login page, and from there, once the data has been entered, I execute the following code:

        var authTicket = new FormsAuthenticationTicket(1, login.LoginDataContract.MSISDN, DateTime.Now,
            DateTime.Now.AddMinutes(Convert.ToDouble("30")), true, "");

        string cookieContents = FormsAuthentication.Encrypt(authTicket);
        var cookie = new HttpCookie(FormsAuthentication.FormsCookieName, cookieContents)
        {
            Expires = authTicket.Expiration,
            //Path = FormsAuthentication.FormsCookiePath
            //Path = "http://localhost"
            Domain = ""
        };

        if (System.Web.HttpContext.Current != null)
        {
            System.Web.HttpContext.Current.Response.Cookies.Add(cookie);
        }

    As you can see, I have set Domain = "", so theoretically it should work on anything under my http: //localhost. I have also set the persist flag of the cookie to true so I can access it from anywhere under localhost. The cookie writes fine and I get logged in; all good for now. BTW, the URL for this login page is: http: //localhost/MyAccount/Login

    Then I stop the solution and set the other MVC app to be the startup project and run it. The URL for the second site is: http: //localhost/WebActivations/ Here is the code in the other app's start controller:

        public class HomeController : Controller
        {
            public ActionResult Index()
            {
                ViewData["Message"] = "Welcome to ASP.NET MVC!";

                // PASHA: Added code to read the authorization cookie set at
                // login in MyAccount *.sln
                for (int i = 0; i < System.Web.HttpContext.Current.Request.Cookies.Count; i++)
                {
                    Response.Write(System.Web.HttpContext.Current.Request.Cookies[i].Name + " " +
                                   System.Web.HttpContext.Current.Request.Cookies[i].Value);
                }

                HttpCookie authorizationCookie =
                    System.Web.HttpContext.Current.Request.Cookies[FormsAuthentication.FormsCookieName.ToString()];

                // decrypt.
                FormsAuthenticationTicket authorizationForm = FormsAuthentication.Decrypt(authorizationCookie.Value);
                ViewData["Message"] = authorizationForm.UserData[0].ToString();

                return View();
            }

            public ActionResult About()
            {
                return View();
            }

    The problem is that in this Home controller, when I run the solution, it cannot read the authentication cookie: the loop there does not find the .ASPXAUTH cookie. But once it crashes, if I look in Firefox under Page Info, Security, Cookies, the cookie is there and it is the same cookie. What am I doing wrong?
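
    Two points are usually relevant in this setup, offered here as hedged suggestions rather than a confirmed diagnosis: both applications need the same machineKey (validationKey/decryptionKey) in web.config or FormsAuthentication.Decrypt in the second app cannot read the first app's ticket, and the cookie read should be guarded so a missing cookie doesn't crash the action. A defensive version of that read might look like this sketch:

        // Sketch: read the shared forms-auth cookie defensively. Assumes both apps use the same
        // FormsAuthentication cookie name and an identical <machineKey> configuration; otherwise
        // Decrypt fails even when the cookie is present.
        public ActionResult Index()
        {
            HttpCookie authCookie = Request.Cookies[FormsAuthentication.FormsCookieName];

            if (authCookie == null || string.IsNullOrEmpty(authCookie.Value))
            {
                ViewData["Message"] = "No forms-auth cookie was sent with this request.";
                return View();
            }

            try
            {
                FormsAuthenticationTicket ticket = FormsAuthentication.Decrypt(authCookie.Value);

                // The ticket's Name holds the MSISDN passed when the ticket was created.
                ViewData["Message"] = ticket != null ? ticket.Name : "Ticket could not be decrypted.";
            }
            catch (Exception)
            {
                ViewData["Message"] = "Cookie found, but it could not be decrypted (check machineKey).";
            }

            return View();
        }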

    Read the article

  • Debugging site written mainly in JScript with AJAX code injection

    - by blumidoo
    Hello, I have legacy code to maintain, and while trying to understand the logic behind it I have run into lots of annoying issues. The application is written mainly in JavaScript, with extensive usage of jQuery plus different plugins, especially Accordion. It creates a wizard-like flow, where client code for the next step is downloaded in the background by injecting the result of a remote AJAX request. It also uses callbacks a lot and a pretty complicated "by convention" programming style (lots of event handlers are created on the fly based on certain object names, e.g. the current page name or current step name). Adding to that, the code is very messy and there is no obvious inner structure: the functions are scattered through the code, file names do not reflect the business role of the code, lots of functions and code snippets are most likely not used at all, etc. PROBLEM: How do I approach this code base so that the inner flow of the code can be sort-of "reverse engineered" using a suite of smart debugging tools? Ideally, I would like to be able to attach to the running application and step through the code, breaking on each new function call. It would also be nice to be able to create a "diagram of calls" in the application (i.e. in order to run a particular page's logic, this particular flow of function calls was executed in a particular order), not to mention being able to run a coverage analysis identifying potentially orphaned code fragments. I would like to stress once more that it is impossible to understand the inner logic of the application just by looking at the code itself, unless you have LOTS of spare time and beer crates, which I unfortunately do not have :/ (shame...) An IDE of some sort that would aid in extending that code would also be great, but I am currently looking into the possibility of using Visual Studio 2010 to do the job, as the site itself is a mix of Classic ASP and ASP.NET (I'd say 70% JavaScript with jQuery, 30% ASP). I have obviously tried FireBug, but I was unable to find a way to define a breakpoint or step into the code that is "injected" into the client JS using AJAX calls (i.e. the application retrieves the code by invoking a URL and injects it into the client local code). The Venkman debugger had similar issues. Any hints would be welcome. Feel free to ask additional questions.

    Read the article

  • VB - Convert Web Site to Web Application

    - by Dave
    Hi, this is my first time doing VB :-) I've inherited a web site, which I've converted into a web application in VS2008. The conversion has worked for everything except a Gallery control. The compile error I'm getting is: Type 'Gallery' is not defined, in file gallery_oct07.aspx.designer.vb:

        Option Strict On
        Option Explicit On

        Partial Public Class gallery_oct07

            '''<summary>
            '''Gallery1 control.
            '''</summary>
            '''<remarks>
            '''Auto-generated field.
            '''To modify move field declaration from designer file to code-behind file.
            '''</remarks>
            Protected WithEvents Gallery1 As Global.Gallery
        End Class

    with squiggly lines under Global.Gallery. The gallery_oct07.aspx.vb is:

        Partial Class gallery_oct07
            Inherits System.Web.UI.Page
        End Class

    Gallery.ascx is:

        <%@ Control Language="C#" AutoEventWireup="true" Codebehind="Gallery.ascx.cs" Inherits="WebApplication1.Gallery"%>
        <asp:Repeater runat="server" ID="rptGallery">
            <HeaderTemplate>
                <ul class='<%#CssClass%>'>
            </HeaderTemplate>
            <ItemTemplate>
                <li><a href='<%#ImageFolder + Eval("Name") %>' class="thickbox" rel="gallery"><img src='<%#ImageFolder + "thumb/" + Eval("Name") %>' /></a></li>
            </ItemTemplate>
            <FooterTemplate>
                </ul>
            </FooterTemplate>
        </asp:Repeater>

    and the code-behind is:

        using System;
        using System.IO;

        namespace WebApplication1
        {
            public partial class Gallery : System.Web.UI.UserControl
            {
                public string _ImageFolder;
                public string ImageFolder
                {
                    get { return _ImageFolder; }
                    set { _ImageFolder = value; }
                }

                private string _cssClass = "gallery";
                public string CssClass
                {
                    get { return _cssClass; }
                    set { _cssClass = value; }
                }

                protected void Page_Load(object sender, EventArgs e)
                {
                    DirectoryInfo dir = new DirectoryInfo(MapPath(ImageFolder));
                    FileInfo[] images = dir.GetFiles("*.jpg");
                    rptGallery.DataSource = images;
                    rptGallery.DataBind();
                }

                protected void Page_PreRender(object sender, EventArgs e)
                {
                }
            }
        }

    This feels like a namespace issue. My project namespace is WebApplication1. Cheers!

    Read the article

  • Linking to a section of a site that is hidden by a hide/show JavaScript function

    - by hollyb
    I am using a bit of JavaScript to show/hide sections of a site when a tab is clicked. I'm trying to figure out if there is a way I can link back to the page and have a certain tab open based on that link. Here is the JS:

        var ids = new Array('section1', 'section2', 'section3', 'section4');

        function switchid(id, el) {
            hideallids();
            showdiv(id);
            var li = el.parentNode.parentNode.childNodes[0];
            while (li) {
                if (!li.tagName || li.tagName.toLowerCase() != "li")
                    li = li.nextSibling; // skip the text node
                if (li) {
                    li.className = "";
                    li = li.nextSibling;
                }
            }
            el.parentNode.className = "active";
        }

        function hideallids() {
            // loop through the array and hide each element by id
            for (var i = 0; i < ids.length; i++) {
                hidediv(ids[i]);
            }
        }

        function hidediv(id) {
            // safe function to hide an element with a specified id
            document.getElementById(id).style.display = 'none';
        }

        function showdiv(id) {
            // safe function to show an element with a specified id
            document.getElementById(id).style.display = 'block';
        }

    And the HTML:

        <ul>
            <li class="active"><a onclick="switchid('section1', this);return false;">One</a></li>
            <li><a onclick="switchid('section2', this);return false;">Two</a></li>
            <li><a onclick="switchid('section3', this);return false;">Three</a></li>
            <li><a onclick="switchid('section4', this);return false;">Four</a></li>
        </ul>

        <div id="section1" style="display:block;">
        <div id="section2" style="display:none;">
        <div id="section3" style="display:none;">
        <div id="section4" style="display:none;">

    I haven't been able to come up with a way to link back to a specific section. Is it even possible with this method? Thanks!

    Read the article

  • DNS servers not set properly? Site failing to load by hostname, though loads fine by IP

    - by Crashalot
    We run www.tekiki.com. Some users, including us, cannot reach www.tekiki.com because of DNS issues. The site resolves fine on the desktop, but it fails from our iPhones and iPads. This doesn't happen to everyone. We noticed the problem yesterday, then set our DNS servers to Cloudflare's DNS servers, hoping that would fix things. Accessing the site by IP addr loads the site fine. Two questions: 1) Does anyone know what the solution is? 2) Should we use other DNS servers besides Cloudflare?

    Read the article
