Search Results

Search found 37688 results on 1508 pages for 'site search'.


  • About lifecycle of activities

    - by janfsd
    In my application I have several activities. The main screen has 4 buttons that each start a different activity. One of them is a search activity; once it searches, it shows you a result activity. This result activity can also be reached from other activities, so in general something like this:

        Main activity -> Search activity -> Result activity
        Main activity -> some other activity -> Result activity

    Now, if I have reached this Result activity and press back once or twice, and after that press the Home key, it shows the Home screen. But if I go back to my application by holding the Home button and clicking on my app, it always goes back to the Result activity, no matter which activity was the last one I was using. And if I press back again it takes me back to the Home screen. If I try it again it takes me to the Result activity again. The only way to avoid this is to start the application by clicking on the app's icon. That takes me to the last activity I was using, and it remembers the state, so if I press back it doesn't take me to the Home screen but to the activity before it. To illustrate this:

        Main activity -> Search activity -> Result activity
        --back--> Search activity
        --Home button--> Home screen
        --hold Home and select the app--> Result activity
        --back--> Home screen
        --click application icon--> Search activity
        --back--> Main activity

    Another thing that happens is that if I press the Home button while on the Result activity and start the app by clicking the icon, it takes me to the activity prior to the Result one. Why is this happening? Any workarounds?
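    One attribute that is sometimes suggested for controlling what the user sees when a task is brought back to the front is clearTaskOnLaunch on the root activity. A minimal manifest sketch (the activity names are placeholders, and this changes the resume behaviour rather than explaining the inconsistency described above):

        <!-- AndroidManifest.xml sketch: activity names are hypothetical, not from the question -->
        <activity android:name=".MainActivity"
                  android:clearTaskOnLaunch="true">  <!-- re-launching from Home resets the task to this activity -->
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />
                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>
        <activity android:name=".SearchActivity" />
        <activity android:name=".ResultActivity" />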

    Read the article

  • pagination in fbjs/ajax

    - by fusion
    i've a search form in which i'm trying to implement pagination - getting the data through ajax. everything works out fine initially, except when i go to the next page or any of the links on the pagination. it gives me a page not found error. can anyone please point out what is wrong with my code? search.html <div class="search_wrapper"> <input type="text" name="query" id="query" class="txt_search" onkeyup="submitPage('http://website/name/search.php', 'txtHint', '1');" /> <input type="button" name="button" class="button_search" onclick="submitPage('http://website/name/search.php', 'txtHint', '1');" /> <p> <div id="txtHint"></div> </p> </div> search ajax.js: function submitPage(url, target_id, page) { // Retrieve element handles, and populate request parameters. var target = document.getElementById(target_id); if(typeof page == 'undefined') { page = 1; } // Set up an AJAX object. Typically, an FBML response is desired. document.getElementById(target_id).setInnerXHTML('<span id="caric"><center><img src="http://website/name/images/ajax-loader.gif" /></center></span>'); var ajax = new Ajax(); ajax.responseType = Ajax.FBML; ajax.requireLogin = true; ajax.ondone = function(data) { // When the FBML response is returned, populate the data into the target element. document.getElementById('caric').setStyle('display','none'); if (target) target.setInnerFBML(data); } ajax.onerror = function() { var msgdialog = new Dialog(); msgdialog.showMessage('Error', 'An error has occurred while trying to load.'); return false; } var params = { 'query' : document.getElementById('query').getValue() }; ajax.post(url, params, page); } search.php: $search_result = ""; if (isset($_POST["query"])) $search_result = trim($_POST["query"]); if(isset($_GET['page'])) $page = $_GET['page']; else $page = 1; ..... $self = $_SERVER['PHP_SELF']; $limit = 2; //Number of results per page $numpages=ceil($totalrows/$limit); $query = $query." ORDER BY idQuotes LIMIT " . ($page-1)*$limit . ",$limit"; $result = mysql_query($query, $conn) or die('Error:' .mysql_error()); ?> <div class="search_caption">Search Results</div> <div class="search_div"> <table> . . .display results </table> </div> <hr> <div class="searchmain"> <?php //Create and print the Navigation bar $nav=""; $next = $page+1; $prev = $page-1; if($page > 1) { $nav .= "<a onclick=\"submitPage('','','$prev'); return false;\" href=\"$self?page=" . $prev . "&q=" .urlencode($search_result) . "\">< Prev</a>"; $first = "<a onclick=\"submitPage('','','1'); return false;\" href=\"$self?page=1&q=" .urlencode($search_result) . "\"> << </a>" ; } else { $nav .= "&nbsp;"; $first = "&nbsp;"; } for($i = 1 ; $i <= $numpages ; $i++) { if($i == $page) { $nav .= "<span class=\"no_link\">$i</span>"; }else{ $nav .= "<a onclick=\"submitPage('','',$i); return false;\" href=\"$self?page=" . $i . "&q=" .urlencode($search_result) . "\">$i</a>"; } } if($page < $numpages) { $nav .= "<a onclick=\"submitPage('','','$next'); return false;\" href=\"$self?page=" . $next . "&q=" .urlencode($search_result) . "\">Next ></a>"; $last = "<a onclick=\"submitPage('','','$numpages'); return false;\" href=\"$self?page=$numpages&q=" .urlencode($search_result) . "\"> >> </a>"; } else { $nav .= "&nbsp;"; $last = "&nbsp;"; } echo $first . $nav . $last; ?> </div> this is the link which displays on the next page: http://apps.facebook.com/website-folder/search.php?page=2&q=good&_fb_fromhash=[some obscure number]

    Read the article

  • Sharepoint : Access denied when editing a page (because of page layout) or list item

    - by tinky05
    I'm logged in as the System Account, so it's probably not a "real access denied"! What I've done : - A custom master page - A custom page layout from a custom content type (with custom fields) If I add a custom field (aka "content field" in the tools in SPD) in my page layout, I get an access denied when I try to edit a page that comes from that page layout. So, for example, if I add in my page layout this line in a "asp:content" tag : I get an access denied. If I remove it, everyting is fine. (the field "test" is a field that comes from the content type). Any idea? UPDATE Well, I tried in a blank site and it worked fine, so there must be something wrong with my web application :( UPDATE #2 Looks like this line in the master page gives me the access denied : <SharePoint:DelegateControl runat="server" ControlId="PublishingConsole" Visible="false" PrefixHtml="&lt;tr&gt;&lt;td colspan=&quot;0&quot; id=&quot;mpdmconsole&quot; class=&quot;s2i-consolemptablerow&quot;&gt;" SuffixHtml="&lt;/td&gt;&lt;/tr&gt;"></SharePoint:DelegateControl> UPDATE #3 I Found http://odole.wordpress.com/2009/01/30/access-denied-error-message-while-editing-properties-of-any-document-in-a-moss-document-library/ Looks like a similar issue. But our Sharepoint versions are with the latest updates. I'll try to use the code that's supposed to fix the lists and post another update. ** UPDATE #4** OK... I tried the code that I found on the page above (see link) and it seems to fix the thing. I haven't tested the solution at 100% but so far, so good. Here's the code I made for a feature receiver (I used the code posted from the link above) : using System; using System.Collections.Generic; using System.Text; using Microsoft.SharePoint; using System.Xml; namespace MyWebsite.FixAccessDenied { class FixAccessDenied : SPFeatureReceiver { public override void FeatureActivated(SPFeatureReceiverProperties properties) { FixWebField(SPContext.Current.Web); } public override void FeatureDeactivating(SPFeatureReceiverProperties properties) { //throw new Exception("The method or operation is not implemented."); } public override void FeatureInstalled(SPFeatureReceiverProperties properties) { //throw new Exception("The method or operation is not implemented."); } public override void FeatureUninstalling(SPFeatureReceiverProperties properties) { //throw new Exception("The method or operation is not implemented."); } static void FixWebField(SPWeb currentWeb) { string RenderXMLPattenAttribute = "RenderXMLUsingPattern"; SPSite site = new SPSite(currentWeb.Url); SPWeb web = site.OpenWeb(); web.AllowUnsafeUpdates = true; web.Update(); SPField f = web.Fields.GetFieldByInternalName("PermMask"); string s = f.SchemaXml; Console.WriteLine("schemaXml before: " + s); XmlDocument xd = new XmlDocument(); xd.LoadXml(s); XmlElement xe = xd.DocumentElement; if (xe.Attributes[RenderXMLPattenAttribute] == null) { XmlAttribute attr = xd.CreateAttribute(RenderXMLPattenAttribute); attr.Value = "TRUE"; xe.Attributes.Append(attr); } string strXml = xe.OuterXml; Console.WriteLine("schemaXml after: " + strXml); f.SchemaXml = strXml; foreach (SPWeb sites in site.AllWebs) { FixField(sites.Url); } } static void FixField(string weburl) { string RenderXMLPattenAttribute = "RenderXMLUsingPattern"; SPSite site = new SPSite(weburl); SPWeb web = site.OpenWeb(); web.AllowUnsafeUpdates = true; web.Update(); System.Collections.Generic.IList<Guid> guidArrayList = new System.Collections.Generic.List<Guid>(); foreach (SPList list in web.Lists) { guidArrayList.Add(list.ID); } 
foreach (Guid guid in guidArrayList) { SPList list = web.Lists[guid]; SPField f = list.Fields.GetFieldByInternalName("PermMask"); string s = f.SchemaXml; Console.WriteLine("schemaXml before: " + s); XmlDocument xd = new XmlDocument(); xd.LoadXml(s); XmlElement xe = xd.DocumentElement; if (xe.Attributes[RenderXMLPattenAttribute] == null) { XmlAttribute attr = xd.CreateAttribute(RenderXMLPattenAttribute); attr.Value = "TRUE"; xe.Attributes.Append(attr); } string strXml = xe.OuterXml; Console.WriteLine("schemaXml after: " + strXml); f.SchemaXml = strXml; } } } } Just put that code as a Feature Receiver, and activate it at the root site, it should loop trough all the subsites and fix the lists. SUMMARY You get an ACCESS DENIED when editing a PAGE or an ITEM You still get the error even if you're logged in as the Super Admin of the f****in world (sorry, I spent 3 days on that bug) For me, it happened after an import from another site definition (a cmp file) Actually, it's supposed to be a known bug and it's supposed to be fixed since February 2009, but it looks like it's not. The code I posted above should fix the thing.

    Read the article

  • UIWebView NSURLRequestReloadIgnoringLocalCacheData doesn't actually ignore the cache

    - by dodeskjeggen
    I have a UIWebView object, with the caching-policy specified as: NSURLRequestReloadIgnoringLocalCacheData This should ignore whatever objects are in the local cache and retrieve the latest version of a site from the web. However, after the first load of the site (10 resources in trace, HTTP GET), all subsequent loads of the site only retrieve a small subset of resources (3 resources in trace, HTTP GET). The images all appear to be loaded from some local source. I have confirmed that my sharedURLCache has a memory usage of 0 bytes, and a disk usage of 0 bytes. Whenever the process starts fresh, the full version of the site is retrieved again. This leads me to believe that these resources are being stored in an in-memory cache, but as I noted before, [[NSURLCache sharedURLCache] currentMemoryUsage] returns 0. I have also tried explicitly removing the cached response for my request, but this seems to have no effect. What gives?

    Read the article

  • Nested Row problem

    - by Patrick
    Hi, I'm using the 1kb css grid framework for a site, and although nested rows are apparently supported by the framework, when I try to drop in a nested row it doesn't work! Sorry not to explain it better - the site's here, may be easier to just look at the source: http://2605.co.uk/saf/build/ the grid: /grid.css the stylesheet: /style.css I'm a graphic designer hacking his way through a site he shouldn't be having to build but there's no budget to speak of! Cheers for any help, Patrick

    Read the article

  • Entity Attribute Value Database vs. strict Relational Model Ecommerce question

    - by Dr. Zim
    It is safe to say that the EAV/CR database model is bad. That said, Question: What database model, technique, or pattern should be used to deal with "classes" of attributes describing e-commerce products which can be changed at run time? In a good E-commerce database, you will store classes of options (like TV resolution then have a resolution for each TV, but the next product may not be a TV and not have "TV resolution"). How do you store them, search efficiently, and allow your users to setup product types with variable fields describing their products? If the search engine finds that customers typically search for TVs based on console depth, you could add console depth to your fields, then add a single depth for each tv product type at run time. There is a nice common feature among good e-commerce apps where they show a set of products, then have "drill down" side menus where you can see "TV Resolution" as a header, and the top five most common TV Resolutions for the found set. You click one and it only shows TVs of that resolution, allowing you to further drill down by selecting other categories on the side menu. These options would be the dynamic product attributes added at run time. Further discussion: So long story short, are there any links out on the Internet or model descriptions that could "academically" fix the following setup? I thank Noel Kennedy for suggesting a category table, but the need may be greater than that. I describe it a different way below, trying to highlight the significance. I may need a viewpoint correction to solve the problem, or I may need to go deeper in to the EAV/CR. Love the positive response to the EAV/CR model. My fellow developers all say what Jeffrey Kemp touched on below: "new entities must be modeled and designed by a professional" (taken out of context, read his response below). The problem is: entities add and remove attributes weekly (search keywords dictate future attributes) new entities arrive weekly (products are assembled from parts) old entities go away weekly (archived, less popular, seasonal) The customer wants to add attributes to the products for two reasons: department / keyword search / comparison chart between like products consumer product configuration before checkout The attributes must have significance, not just a keyword search. If they want to compare all cakes that have a "whipped cream frosting", they can click cakes, click birthday theme, click whipped cream frosting, then check all cakes that are interesting knowing they all have whipped cream frosting. This is not specific to cakes, just an example.

    Read the article

  • What does calling 'this' outside of a jQuery plugin refer to

    - by Richard
    Hi, I am using the liveTwitter plugin The problem is that I need to stop the plugin from hitting the Twitter api. According to the documentation I need to do this $("#tab1 .container_twitter_status").each(function(){ this.twitter.stop(); }); Already, the each does not make sense on an id and what does this refer to? Anyway, I get an undefined error. I will paste the plugin code and hope it makes sense to somebody MY only problem thusfar with this plugin is that I need to be able to stop it. thanks in advance, Richard /* * jQuery LiveTwitter 1.5.0 * - Live updating Twitter plugin for jQuery * * Copyright (c) 2009-2010 Inge Jørgensen (elektronaut.no) * Licensed under the MIT license (MIT-LICENSE.txt) * * $Date: 2010/05/30$ */ /* * Usage example: * $("#twitterSearch").liveTwitter('bacon', {limit: 10, rate: 15000}); */ (function($){ if(!$.fn.reverse){ $.fn.reverse = function() { return this.pushStack(this.get().reverse(), arguments); }; } $.fn.liveTwitter = function(query, options, callback){ var domNode = this; $(this).each(function(){ var settings = {}; // Handle changing of options if(this.twitter) { settings = jQuery.extend(this.twitter.settings, options); this.twitter.settings = settings; if(query) { this.twitter.query = query; } this.twitter.limit = settings.limit; this.twitter.mode = settings.mode; if(this.twitter.interval){ this.twitter.refresh(); } if(callback){ this.twitter.callback = callback; } // ..or create a new twitter object } else { // Extend settings with the defaults settings = jQuery.extend({ mode: 'search', // Mode, valid options are: 'search', 'user_timeline' rate: 15000, // Refresh rate in ms limit: 10, // Limit number of results refresh: true }, options); // Default setting for showAuthor if not provided if(typeof settings.showAuthor == "undefined"){ settings.showAuthor = (settings.mode == 'user_timeline') ? 
false : true; } // Set up a dummy function for the Twitter API callback if(!window.twitter_callback){ window.twitter_callback = function(){return true;}; } this.twitter = { settings: settings, query: query, limit: settings.limit, mode: settings.mode, interval: false, container: this, lastTimeStamp: 0, callback: callback, // Convert the time stamp to a more human readable format relativeTime: function(timeString){ var parsedDate = Date.parse(timeString); var delta = (Date.parse(Date()) - parsedDate) / 1000; var r = ''; if (delta < 60) { r = delta + ' seconds ago'; } else if(delta < 120) { r = 'a minute ago'; } else if(delta < (45*60)) { r = (parseInt(delta / 60, 10)).toString() + ' minutes ago'; } else if(delta < (90*60)) { r = 'an hour ago'; } else if(delta < (24*60*60)) { r = '' + (parseInt(delta / 3600, 10)).toString() + ' hours ago'; } else if(delta < (48*60*60)) { r = 'a day ago'; } else { r = (parseInt(delta / 86400, 10)).toString() + ' days ago'; } return r; }, // Update the timestamps in realtime refreshTime: function() { var twitter = this; $(twitter.container).find('span.time').each(function(){ $(this).html(twitter.relativeTime(this.timeStamp)); }); }, // Handle reloading refresh: function(initialize){ var twitter = this; if(this.settings.refresh || initialize) { var url = ''; var params = {}; if(twitter.mode == 'search'){ params.q = this.query; if(this.settings.geocode){ params.geocode = this.settings.geocode; } if(this.settings.lang){ params.lang = this.settings.lang; } if(this.settings.rpp){ params.rpp = this.settings.rpp; } else { params.rpp = this.settings.limit; } // Convert params to string var paramsString = []; for(var param in params){ if(params.hasOwnProperty(param)){ paramsString[paramsString.length] = param + '=' + encodeURIComponent(params[param]); } } paramsString = paramsString.join("&"); url = "http://search.twitter.com/search.json?"+paramsString+"&callback=?"; } else if(twitter.mode == 'user_timeline') { url = "http://api.twitter.com/1/statuses/user_timeline/"+encodeURIComponent(this.query)+".json?count="+twitter.limit+"&callback=?"; } else if(twitter.mode == 'list') { var username = encodeURIComponent(this.query.user); var listname = encodeURIComponent(this.query.list); url = "http://api.twitter.com/1/"+username+"/lists/"+listname+"/statuses.json?per_page="+twitter.limit+"&callback=?"; } $.getJSON(url, function(json) { var results = null; if(twitter.mode == 'search'){ results = json.results; } else { results = json; } var newTweets = 0; $(results).reverse().each(function(){ var screen_name = ''; var profile_image_url = ''; if(twitter.mode == 'search') { screen_name = this.from_user; profile_image_url = this.profile_image_url; created_at_date = this.created_at; } else { screen_name = this.user.screen_name; profile_image_url = this.user.profile_image_url; // Fix for IE created_at_date = this.created_at.replace(/^(\w+)\s(\w+)\s(\d+)(.*)(\s\d+)$/, "$1, $3 $2$5$4"); } var userInfo = this.user; var linkified_text = this.text.replace(/[A-Za-z]+:\/\/[A-Za-z0-9-_]+\.[A-Za-z0-9-_:%&\?\/.=]+/, function(m) { return m.link(m); }); linkified_text = linkified_text.replace(/@[A-Za-z0-9_]+/g, function(u){return u.link('http://twitter.com/'+u.replace(/^@/,''));}); linkified_text = linkified_text.replace(/#[A-Za-z0-9_\-]+/g, function(u){return u.link('http://search.twitter.com/search?q='+u.replace(/^#/,'%23'));}); if(!twitter.settings.filter || twitter.settings.filter(this)) { if(Date.parse(created_at_date) > twitter.lastTimeStamp) { newTweets += 1; var tweetHTML = '<div 
class="tweet tweet-'+this.id+'">'; if(twitter.settings.showAuthor) { tweetHTML += '<img width="24" height="24" src="'+profile_image_url+'" />' + '<p class="text"><span class="username"><a href="http://twitter.com/'+screen_name+'">'+screen_name+'</a>:</span> '; } else { tweetHTML += '<p class="text"> '; } tweetHTML += linkified_text + ' <span class="time">'+twitter.relativeTime(created_at_date)+'</span>' + '</p>' + '</div>'; $(twitter.container).prepend(tweetHTML); var timeStamp = created_at_date; $(twitter.container).find('span.time:first').each(function(){ this.timeStamp = timeStamp; }); if(!initialize) { $(twitter.container).find('.tweet-'+this.id).hide().fadeIn(); } twitter.lastTimeStamp = Date.parse(created_at_date); } } }); if(newTweets > 0) { // Limit number of entries $(twitter.container).find('div.tweet:gt('+(twitter.limit-1)+')').remove(); // Run callback if(twitter.callback){ twitter.callback(domNode, newTweets); } // Trigger event $(domNode).trigger('tweets'); } }); } }, start: function(){ var twitter = this; if(!this.interval){ this.interval = setInterval(function(){twitter.refresh();}, twitter.settings.rate); this.refresh(true); } }, stop: function(){ if(this.interval){ clearInterval(this.interval); this.interval = false; } } }; var twitter = this.twitter; this.timeInterval = setInterval(function(){twitter.refreshTime();}, 5000); this.twitter.start(); } }); return this; }; })(jQuery);

    Read the article

  • .htaccess 301 redirect without GET var

    - by tvgemert
    Hi, for a website I'm currently working on we're redirecting our old URLs permanently to new ones like this:

        Redirect 301 /oldfile.php http://www.site.com/show/newurl

    Now I've come across a situation in which the old URL has a GET var:

        Redirect 301 /oldfile.php?var=name http://www.site.com/show/newurl

    This redirects the old file to the new URL but it also appends the GET var, so it redirects to http://www.site.com/show/newurl?var=name. How would I set up this redirect without the GET var?
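    mod_alias's Redirect directive never sees the query string, so one commonly used alternative is mod_rewrite, matching the query string with RewriteCond and ending the target with ? to drop it. A sketch, assuming mod_rewrite is available and reusing the paths from the question:

        # Sketch assuming mod_rewrite is enabled in .htaccess context
        RewriteEngine On
        # only when the old query string is exactly var=name
        RewriteCond %{QUERY_STRING} ^var=name$
        # the trailing "?" on the target drops the original query string
        RewriteRule ^oldfile\.php$ http://www.site.com/show/newurl? [R=301,L]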

    Read the article

  • JQuery - Slide Example

    - by gav
    Hi All, I want to perform a simple slide motion on an HTML element. jQuery is already available on the site in question, so the next logical step for me was to look at their documentation: jQuery - Slide down. When I check out their demo, however, it doesn't seem to be functioning. In Firebug they have an error:

        missing ) after argument list
        wyciwyg://0/http://docs.jquery.com/UI/Effects/Slide Line 18

    Whilst the error seems simple, I can't work out how to correct it (on their site, by editing the JS). On my own site, using the same example, an error is reported in the jQuery 1.4.2 script itself:

        jQuery.easing[specialEasing || defaultEasing] is not a function
        file:///home/gav/ee-workspaces/web/site/php/jquery-1.4.2.js Line 5854

    I don't mean to sound lazy or rude, but what's going on? Are the jQuery site and the newest release actually broken? I doubt it, so what am I doing wrong? I'm a CS grad with no real web dev experience, so I'm not used to this method of debugging. Where should I start? Thanks, Gav
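    For reference, a minimal slide that uses core jQuery only and does not depend on the jQuery UI effects file (the element IDs and the 400 ms duration are made up for the illustration):

        // Minimal sketch: "#toggle" and "#panel" are placeholder element IDs
        $(document).ready(function () {
            $("#toggle").click(function () {
                $("#panel").slideToggle(400);   // slideDown()/slideUp() also work individually
            });
        });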

    Read the article

  • How to page while maintaining the querystring values in ASP.Net Mvc 2

    - by Picflight
    I am using the pager provided by Martijin Boland to implement paging in my ASP.NET MVC 2 application. My form uses the GET method to send all parameters on the querystring; it is a search form with several form elements:

        <% using (Html.BeginForm("SearchResults", "Search", FormMethod.Get)) {%>

    On the SearchResults view I am trying to implement paging:

        <div class="pager">
            <%= Html.Pager(Model.PageSize, Model.PageNumber, Model.TotalItemCount, new { Request.QueryString })%>
        </div>

    The Html.Pager has some overloads which I am not too clear on how to use. Passing Request.QueryString makes the querystring look like this:

        http://localhost:1155/Search/SearchResults?QueryString=Distance%3D10%26txtZip%3D%26cb&page=2

    Should it not be like this?

        http://localhost:1155/Search/SearchResults?Distance=20&txtZip=10021&page=2
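    The anonymous object new { Request.QueryString } produces a single route value literally named "QueryString", which is why the whole collection is serialised into one parameter. A sketch of the usual alternative: copy each querystring key into the route values individually. Whether Html.Pager accepts a RouteValueDictionary directly depends on the pager version, so treat the last line as an assumption; the loop itself is illustrative.

        <%
            // Sketch: pass the filters as individual route values, not as one "QueryString" value
            var routeValues = new System.Web.Routing.RouteValueDictionary();
            foreach (string key in Request.QueryString.Keys)
            {
                if (key != null && key != "page")          // let the pager supply "page"
                {
                    routeValues[key] = Request.QueryString[key];
                }
            }
        %>
        <%= Html.Pager(Model.PageSize, Model.PageNumber, Model.TotalItemCount, routeValues) %>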

    Read the article

  • Click Once Deployment Process and Issue Resolution

    - by Geordie
    Introduction We are adopting Click Once as a deployment standard for Thick .Net application clients.  The latest version of this tool has matured it to a point where it can be used in an enterprise environment.  This guide will identify how to use Click Once deployment and promote code trough the dev, test and production environments. Why Use Click Once over SCCM If we already use SCCM why add Click Once to the deployment options.  The advantages of Click Once are their ability to update the code in a single location and have the update flow automatically down to the user community.  There have been challenges in the past with getting configuration updates to download but these can now be achieved.  With SCCM you can do the same thing but it then needs to be packages and pushed out to users.  Each time a new user is added to an application, time needs to be spent by an administrator, to push out any required application packages.  With Click Once the user would go to a web link and the application and pre requisites will automatically get installed. New Deployment Steps Overview The deployment in an enterprise environment includes several steps as the solution moves through the development life cycle before being released into production.  To make mitigate risk during the release phase, it is important to ensure the solution is not deployed directly into production from the development tools.  Although this is the easiest path, it can introduce untested code into production and result in unexpected results. 1. Deploy the client application to a development web server using Visual Studio 2008 Click Once deployment tools.  Once potential production versions of the solution are being generated, ensure the production install URL is specified when deploying code from Visual Studio.  (For details see ‘Deploying Click Once Code from Visual Studio’) 2. xCopy the code to the test server.  Run the MageUI tool to update the URLs, signing and version numbers to match the test server. (For details see ‘Moving Click Once Code to a new Server without using Visual Studio’) 3. xCopy the code to the production server.  Run the MageUI tool to update the URLs, signing and version numbers to match the production server. The certificate used to sign the code should be provided by a certificate authority that will be trusted by the client machines.  Finally make sure the setup.exe contains the production install URL.  If not redeploy the solution from Visual Studio to the dev environment specifying the production install URL.  Then xcopy the install.exe file from dev to production.  (For details see ‘Moving Click Once Code to a new Server without using Visual Studio’) Detailed Deployment Steps Deploying Click Once Code From Visual Studio Open Visual Studio and create a new WinForms or WPF project.   In the solution explorer right click on the project and select ‘Publish’ in the context menu.   The ‘Publish Wizard’ will start.  Enter the development deployment path.  This could be a local directory or web site.  When first publishing the solution set this to a development web site and Visual basic will create a site with an install.htm page.  Click Next.  Select weather the application will be available both online and offline. Then click Finish. Once the initial deployment is completed, republish the solution this time mapping to the directory that holds the code that was just published.  This time the Publish Wizard contains and additional option.   
The setup.exe file that is created has the install URL hardcoded in it.  It is this screen that allows you to specify the URL to use.  At some point a setup.exe file must be generated for production.  Enter the production URL and deploy the solution to the dev folder.  This file can then be saved for latter use in deployment to production.  During development this URL should be pointing to development site to avoid accidently installing the production application. Visual studio will publish the application to the desired location in the process it will create an anonymous ‘pfx’ certificate to sign the deployment configuration files.  A production certificate should be acquired in preparation for deployment to production.   Directory structure created by Visual Studio     Application files created by Visual Studio   Development web site (install.htm) created by Visual Studio Migrating Click Once Code to a new Server without using Visual Studio To migrate the Click Once application code to a new server, a tool called MageUI is needed to modify the .application and .manifest files.  The MageUI tool is usually located – ‘C:\Program Files\Microsoft SDKs\Windows\v6.0A\Bin’ folder or can be downloaded from the web. When deploying to a new environment copy all files in the project folder to the new server.  In this case the ‘ClickOnceSample’ folder and contents.  The old application versions can be deleted, in this case ‘ClickOnceSample_1_0_0_0’ and ‘ClickOnceSample_1_0_0_1’.  Open IIS Manager and create a virtual directory that points to the project folder.  Also make the publish.htm the default web page.   Run the ManeUI tool and then open the .application file in the root project folder (in this case in the ‘ClickOnceSample’ folder). Click on the Deployment Options in the left hand list and update the URL to the new server URL and save the changes.   When MageUI tries to save the file it will prompt for the file to be signed.   This step cannot be bypassed if you want the Click Once deployment to work from a web site.  The easiest solution to this for test is to use the auto generated certificate that Visual Studio created for the project.  This certificate can be found with the project source code.   To save time go to File>Preferences and configure the ‘Use default signing certificate’ fields.   Future deployments will only require application files to be transferred to the new server.  The only difference is then updating the .application file the ‘Version’ must be updated to match the new version and the ‘Application Reference’ has to be update to point to the new .manifest file.     Updating the Configuration File of a Click Once Deployment Package without using Visual Studio When an update to the configuration file is required, modifying the ClickOnceSample.exe.config.deploy file will not result in current users getting the new configurations.  We do not want to go back to Visual Studio and generate a new version as this might introduce unexpected code changes.  A new version of the application can be created by copying the folder (in this case ClickOnceSample_1_0_0_2) and pasting it into the application Files directory.  Rename the directory ‘ClickOnceSample_1_0_0_3’.  In the new folder open the configuration file in notepad and make the configuration changes. Run MageUI and open the manifest file in the newly copied directory (ClickOnceSample_1_0_0_3).   Edit the manifest version to reflect the newly copied files (in this case 1.0.0.3).  Then save the file.  
Open the .application file in the root folder.  Again update the version to 1.0.0.3.  Since the file has not changed the Deployment Options/Start Location URL should still be correct.  The application Reference needs to be updated to point to the new versions .manifest file.  Save the file. Next time a user runs the application the new version of the configuration file will be down loaded.  It is worth noting that there are 2 different types of configuration parameter; application and user.  With Click Once deployment the difference is significant.  When an application is downloaded the configuration file is also brought down to the client machine.  The developer may have written code to update the user parameters in the application.  As a result each time a new version of the application is down loaded the user parameters are at risk of being overwritten.  With Click Once deployment the system knows if the user parameters are still the default values.  If they are they will be overwritten with the new default values in the configuration file.  If they have been updated by the user, they will not be overwritten. Settings configuration view in Visual Studio Production Deployment When deploying the code to production it is prudent to disable the development and test deployment sites.  This will allow errors such as incorrect URL to be quickly identified in the initial testing after deployment.  If the sites are active there is no way to know if the application was downloaded from the production deployment and not redirected to test or dev.   Troubleshooting Clicking the install button on the install.htm page fails. Error: URLDownloadToCacheFile failed with HRESULT '-2146697210' Error: An error occurred trying to download <file>   This is due to the setup.exe file pointing to the wrong location. ‘The setup.exe file that is created has the install URL hardcoded in it.  It is this screen that allows you to specify the URL to use.  At some point a setup.exe file must be generated for production.  Enter the production URL and deploy the solution to the dev folder.  This file can then be saved for latter use in deployment to production.  During development this URL should be pointing to development site to avoid accidently installing the production application.’
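    The MageUI steps described above can usually also be scripted with the command-line mage.exe that ships in the same SDK folder, which is convenient for repeatable test/production promotions. A rough sketch only: the switch names should be verified against the mage.exe documentation for your SDK version, and the paths, version number and certificate password are placeholders.

        :: Sketch only - verify the exact mage.exe switches against your SDK documentation
        :: 1. bump the application manifest to the new version and re-sign it
        mage -Update ClickOnceSample_1_0_0_3\ClickOnceSample.exe.manifest -Version 1.0.0.3
        mage -Sign   ClickOnceSample_1_0_0_3\ClickOnceSample.exe.manifest -CertFile key.pfx -Password secret

        :: 2. point the deployment manifest at the new application manifest and re-sign it
        mage -Update ClickOnceSample.application -Version 1.0.0.3 -AppManifest ClickOnceSample_1_0_0_3\ClickOnceSample.exe.manifest -ProviderUrl http://server/ClickOnceSample/ClickOnceSample.application
        mage -Sign   ClickOnceSample.application -CertFile key.pfx -Password secret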

    Read the article

  • C# Active Directory - Check username / password

    - by Michael G
    I'm using the following code on Windows Vista Ultimate SP1 to query our active directory server to check the user name and password of a user on a domain. public Object IsAuthenticated() { String domainAndUsername = strDomain + @"\" + strUser; DirectoryEntry entry = new DirectoryEntry(_path, domainAndUsername, strPass); SearchResult result; try { //Bind to the native AdsObject to force authentication. DirectorySearcher search = new DirectorySearcher(entry) { Filter = ("(SAMAccountName=" + strUser + ")") }; search.PropertiesToLoad.Add("givenName"); // First Name search.PropertiesToLoad.Add("sn"); // Last Name search.PropertiesToLoad.Add("cn"); // Last Name result = search.FindOne(); if (null == result) { return null; } //Update the new path to the user in the directory. _path = result.Path; _filterAttribute = (String)result.Properties["cn"][0]; } catch (Exception ex) { return new Exception("Error authenticating user. " + ex.Message); } return user; } the target is using .NET 3.5, and compiled with VS 2008 standard I'm logged in under a domain account that is a domain admin where the application is running. The code works perfectly on windows XP; but i get the following exception when running it on Vista: System.DirectoryServices.DirectoryServicesCOMException (0x8007052E): Logon failure: unknown user name or bad password. at System.DirectoryServices.DirectoryEntry.Bind(Boolean throwIfFail) at System.DirectoryServices.DirectoryEntry.Bind() at System.DirectoryServices.DirectoryEntry.get_AdsObject() at System.DirectoryServices.DirectorySearcher.FindAll(Boolean findMoreThanOne) at System.DirectoryServices.DirectorySearcher.FindOne() at Chain_Of_Custody.Classes.Authentication.LdapAuthentication.IsAuthenticated() at System.DirectoryServices.DirectoryEntry.Bind(Boolean throwIfFail) at System.DirectoryServices.DirectoryEntry.Bind() at System.DirectoryServices.DirectoryEntry.get_AdsObject() at System.DirectoryServices.DirectorySearcher.FindAll(Boolean findMoreThanOne) at System.DirectoryServices.DirectorySearcher.FindOne() at Chain_Of_Custody.Classes.Authentication.LdapAuthentication.IsAuthenticated() I've tried changing the authentication types, I'm not sure what's going on. See also: http://stackoverflow.com/questions/290548/c-validate-a-username-and-password-against-active-directory
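    As a side note, on .NET 3.5 the same username/password check can often be done without DirectorySearcher via System.DirectoryServices.AccountManagement. A minimal sketch (the domain name is a placeholder, and this only validates the credentials; it does not load givenName/sn/cn as the code above does):

        // Sketch: requires a reference to System.DirectoryServices.AccountManagement (.NET 3.5)
        using System.DirectoryServices.AccountManagement;

        public static bool IsAuthenticated(string domain, string user, string password)
        {
            using (var context = new PrincipalContext(ContextType.Domain, domain))
            {
                // true only if the username/password pair is valid against the domain
                return context.ValidateCredentials(user, password);
            }
        }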

    Read the article

  • Check for Block Ads/Scripts (Browser Addons, Compatibility)

    - by acidzombie24
    I'm conflicted; you guys decide if this should migrate to SU or not. I would like to test my site against popular browser add-ons. At the moment I have tested against NoScript and Adblock Plus for Firefox. What other popular add-ons should I check compatibility with? By compatibility I mean working as intended, ads included, on the browsers I support (Opera, Firefox, Chrome, IE 7/8). NoScript broke my site, and for Adblock Plus I ask once per week to consider allowing ads. When I see IE6 I notify the user that the site is known to be unusable with that browser (the site is script-heavy by nature and I wouldn't want to accidentally serve ads that infect IE6 users with a virus).

    Read the article

  • My .tpl file won't update!

    - by Kyle Sevenoaks
    Hi, I am running the site at www.euroworker.no; it's a Linux server and the site has a backend editor. It's a Smarty/PHP site, and when I try to update a few of the .tpl files (two or three) they don't update. I have tried uploading them through FTP and that doesn't work either. It runs on the LiveCart system. Any ideas? Thanks!
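    Since Smarty compiles templates to PHP and serves the compiled copies, a stale compile cache is a common cause of this. A sketch of the usual checks (directory names vary per install, and whether LiveCart exposes these settings directly is an assumption):

        <?php
        // Sketch: force Smarty to recompile templates while debugging.
        // $smarty is assumed to be the application's existing Smarty instance.
        $smarty->compile_check = true;   // recompile when the .tpl file changes (normally on)
        $smarty->force_compile = true;   // recompile on every request - debugging only
        // Alternatively, delete the compiled copies so they are regenerated,
        // e.g. the contents of the templates_c/ (or equivalent) directory.
        ?>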

    Read the article

  • DDD: Aggregate Roots

    - by Mosh
    Hello, I need help with finding my aggregate root and boundary. I have 3 Entities: Plan, PlannedRole and PlannedTraining. Each Plan can include many PlannedRoles and PlannedTrainings. Solution 1: At first I thought Plan is the aggregate root because PlannedRole and PlannedTraining do not make sense out of the context of a Plan. They are always within a plan. Also, we have a business rule that says each Plan can have a maximum of 3 PlannedRoles and 5 PlannedTrainings. So I thought by nominating the Plan as the aggregate root, I can enforce this invariant. However, we have a Search page where the user searches for Plans. The results shows a few properties of the Plan itself (and none of its PlannedRoles or PlannedTrainings). I thought if I have to load the entire aggregate, it would have a lot of overhead. There are nearly 3000 plans and each may have a few children. Loading all these objects together and then ignoring PlannedRoles and PlannedTrainings in the search page doesn't make sense to me. Solution 2: I just realized the user wants 2 more search pages where they can search for Planned Roles or Planned Trainings. That made me realize they are trying to access these objects independently and "out of" the context of Plan. So I thought I was wrong about my initial design and that is how I came up with this solution. So, I thought to have 3 aggregates here, 1 for each Entity. This approach enables me to search for each Entity independently and also resolves the performance issue in solution 1. However, using this approach I cannot enforce the invariant I mentioned earlier. There is also another invariant that states a Plan can be changed only if it is of a certain status. So, I shouldn't be able to add any PlannedRoles or PlannedTrainings to a Plan that is not in that status. Again, I can't enforce this invariant with the second approach. Any advice would be greatly appreciated. Cheers, Mosh

    Read the article

  • ISO-8859-1 to UTF8 in ASP.NET 2

    - by Gordon Carpenter-Thompson
    We've got a page which posts data to our ASP.NET app in ISO-8859-1 <head> <META HTTP-EQUIV="Content-Type" CONTENT="text/html; charset=iso-8859-1"> <title>`Sample Search Invoker`</title> </head> <body> <form name="advancedform" method="post" action="SearchResults.aspx"> <input class="field" name="SearchTextBox" type="text" /> <input class="button" name="search" type="submit" value="Search &gt;" /> </form> and in the code behind (SearchResults.aspx.cs) System.Collections.Specialized.NameValueCollection postedValues = Request.Form; String nextKey; for (int i = 0; i < postedValues.AllKeys.Length; i++) { nextKey = postedValues.AllKeys[i]; if (nextKey.Substring(0, 2) != "__") { // Get basic search text if (nextKey.EndsWith(XAEConstants.CONTROL_SearchTextBox)) { // Get search text value String sSentSearchText = postedValues[i]; System.Text.Encoding iso88591 = System.Text.Encoding.GetEncoding("iso-8859-1"); System.Text.Encoding utf8 = System.Text.Encoding.UTF8; byte[] abInput = iso88591.GetBytes(sSentSearchText); sSentSearchText = utf8.GetString(System.Text.Encoding.Convert(iso88591, utf8, abInput)); this.SearchText = sSentSearchText.Replace('<', ' ').Replace('>',' '); this.PreviousSearchText.Value = this.SearchText; } } } When we pass through Merkblätter it gets pulled out of postedValues[i] as Merkbl?tter The raw string string is Merkbl%ufffdtter Any ideas?
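    One thing often checked in this situation is the request encoding ASP.NET assumes when it parses the posted form: if the runtime has already decoded the bytes with the wrong charset, re-converting the resulting string in code cannot recover the character. A web.config sketch of the setting that controls this (the values shown are assumptions to illustrate the idea, not necessarily the right ones for this app):

        <!-- web.config sketch: tell ASP.NET which encoding posted form data arrives in -->
        <system.web>
          <globalization requestEncoding="iso-8859-1" responseEncoding="utf-8" />
        </system.web>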

    Read the article

  • Why will my AdRotator not display images? (Image paths are correct)

    - by KSwift87
    Hi. I'm writing a simple web application in C# and I've gotten to the part where I must add an AdRotator object and link four images to it. I have done this, but no matter what I do the images will not show up; only the alternate text. It makes no sense because the paths are correct. Supposedly AdRotator controls are really simple to use... But anyway below is my code. Search.aspx: <%@ Page Language="C#" MasterPageFile="~/Site.Master" AutoEventWireup="true" CodeBehind="Search.aspx.cs" Inherits="Module6.WebForm2" Title="Search" %> <asp:Content ID="Content1" ContentPlaceHolderID="head" runat="server"> </asp:Content> <asp:Content ID="Content2" ContentPlaceHolderID="ContentPlaceHolder1" runat="server"> <form id="Search" runat="server"> This is the Search page! <div class="StartCalendar"> <asp:Calendar ID="Calendar1" runat="server" Caption="Start Date" TodayDayStyle-Font-Bold="true" TodayDayStyle-ForeColor="Crimson" SelectedDayStyle-BackColor="DarkCyan" /> </div> <div class="EndCalendar"> <asp:Calendar ID="Calendar2" runat="server" Caption="End Date" TodayDayStyle-Font-Bold="true" TodayDayStyle-ForeColor="Crimson" SelectedDayStyle-BackColor="DarkCyan" /> </div> <br /><br /> <div class="Search"> <asp:Button ID="btnSearch" runat="server" Text="Search" UseSubmitBehavior="true" /> </div><br /><br /> <div class="CenterAd"> <asp:AdRotator ID="AdRotator1" runat="server" Target="_blank" AdvertisementFile="~/Advertisements.xml" /> </div> <br /><br /> <div class="Results"> <asp:GridView ID="gvResults" runat="server" /> </div> </form> </asp:Content> Advertisements.xml: <?xml version="1.0" encoding="utf-8" ?> <Advertisements> <Ad> <ImageURL>~/images/colts.jpg</ImageURL> <AlternateText>Colts Image</AlternateText> </Ad> <Ad> <ImageURL>~/images/conseco.gif</ImageURL> <AlternateText>Conseco Image</AlternateText> </Ad> <Ad> <ImageURL>~/images/IndianapolisIndians.png</ImageURL> <AlternateText>Indianapolis Indians Image</AlternateText> </Ad> <Ad> <ImageURL>~/images/pacers.gif</ImageURL> <AlternateText>Pacers Image</AlternateText> </Ad> </Advertisements> Any and all help is GREATLY appreciated.
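    For what it's worth, the advertisement file schema the AdRotator reads uses the element name ImageUrl (and optionally NavigateUrl); since XML is case-sensitive, an <ImageURL> element would be ignored and only the alternate text would render. A sketch of one entry with that casing (the NavigateUrl value is purely illustrative):

        <!-- Sketch of one Ad entry using the element casing the AdRotator expects -->
        <Ad>
          <ImageUrl>~/images/colts.jpg</ImageUrl>
          <NavigateUrl>http://www.example.com/colts</NavigateUrl>  <!-- optional; placeholder URL -->
          <AlternateText>Colts Image</AlternateText>
        </Ad>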

    Read the article

  • GlassFish Security Realm, Active Directory and Referral

    - by Allan Lykke Christensen
    I've set up a Security Realm in GlassFish to authenticate against an Active Directory server. The configuration of the realm is as follows:

        Class Name: com.sun.enterprise.security.auth.realm.ldap.LDAPRealm
        JAAS context: ldapRealm
        Directory: ldap://172.16.76.10:389/
        Base DN: dc=smallbusiness,dc=local
        search-filter: (&(objectClass=user)(sAMAccountName=%s))
        group-search-filter: (&(objectClass=group)(member=%d))
        search-bind-dn: cN=Administrator,CN=Users,dc=smallbusiness,dc=local
        search-bind-password: abcd1234!

    The realm is functional and I can log in, but whenever I log in I get the following error in the log:

        SEC1106: Error during LDAP search with filter [(&(objectClass=group)(member=CN=Administrator,CN=Users,dc=smallbusiness,dc=local))].
        SEC1000: Caught exception.
        javax.naming.PartialResultException: Unprocessed Continuation Reference(s); remaining name 'dc=smallbusiness,dc=local'
        at com.sun.jndi.ldap.LdapCtx.processReturnCode(LdapCtx.java:2820)
        ....
        ....
        ldaplm.searcherror

    While searching for a solution I found a recommendation to add java.naming.referral=follow to the properties of the realm. However, after I add this it takes 20 minutes for GlassFish to authenticate against Active Directory. I suspect it is a DNS problem on the Active Directory server. The Active Directory server is a vanilla Windows Server 2003 setup in a virtual machine. Any help/recommendation is highly appreciated!
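    A commonly suggested alternative to chasing referrals with java.naming.referral=follow (an assumption worth testing rather than a confirmed fix for this setup) is to point the realm at Active Directory's global catalog port, which answers for the whole forest without sending continuation references; only the Directory setting changes:

        # Sketch: same realm properties, but the Directory URL uses the AD global catalog port (3268)
        Directory: ldap://172.16.76.10:3268/
        Base DN:   dc=smallbusiness,dc=local
        # leave java.naming.referral unset; the global catalog should not return referrals
        # for objects within the forest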

    Read the article

  • addthis api? Is it possible to bookmark another URL?

    - by Newbie
    Hello! At the moment I'm trying to include AddThis social bookmarks on my site. My problem is that I have to bookmark a different link than the URL of my site: instead of www.example.com, I have to bookmark www.example.com?media=my-media-file.

    From the beginning: a user comes to my site, where there are several movies. He can click on a movie thumbnail and watch the movie without reloading the page (JavaScript). Now he can click on my social bookmark icon to share this movie. The AddThis framework thinks the user wants to share www.example.com (because that is what's in my URL), but I want the user to share www.example.com?media=foo-bar. Opening www.example.com?media=foo-bar still brings up the correct movie, but there is no social bookmarking for it. I hope you understand my problem and can help me. I searched the AddThis API without success. So, can you help me?
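    For reference, the AddThis button markup accepts per-link overrides via addthis:url and addthis:title attributes. A sketch (the class name and attributes are the standard AddThis ones as far as I recall, the URL/title values are placeholders, and the widget script URL should be confirmed against the current AddThis docs):

        <!-- Sketch: override the shared URL per button instead of using the page URL -->
        <a class="addthis_button"
           addthis:url="http://www.example.com?media=foo-bar"
           addthis:title="My movie title">
            Share this movie
        </a>
        <script type="text/javascript" src="http://s7.addthis.com/js/250/addthis_widget.js"></script>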

    Read the article

  • GetVirtualPath not making any sense.

    - by HeavyWave
    Can anyone explain why this code with the given routes returns the first route?

        routes.MapRoute(null, "user/approve", new { controller = "Users", action = "Approve" }),
        routes.MapRoute(null, "user/{username}", new { controller = "Users", action = "Profile" }),
        routes.MapRoute(null, "user/{username}/{action}", new { controller = "Users" }),
        routes.MapRoute(null, "user/{username}/{action}/{id}", new { controller = "Users" }),
        routes.MapRoute(null, "search/{query}", new { controller = "Artists", action = "Search", page = 1 }),
        routes.MapRoute(null, "search/{query}/{page}", new { controller = "Artists", action = "Search" }),
        routes.MapRoute(null, "music", new { controller = "Artists", action = "Index", page = 1 }),
        routes.MapRoute(null, "music/page/{page}", new { controller = "Artists", action = "Index" })

        var pageLinkValueDictionary = new RouteValueDictionary(this.linkWithoutPageValuesDictionary);
        pageLinkValueDictionary.Add("page", pageNumber);
        var virtualPathData = RouteTable.Routes.GetVirtualPath(this.viewContext.RequestContext, pageLinkValueDictionary);

    Here GetVirtualPath always returns user/approve, although there is no {page} parameter in the route. Furthermore, everything works as expected without the first route. I have found this link http://www.codeplex.com/aspnet/WorkItem/View.aspx?WorkItemId=2033 but it wasn't very helpful. It looks like GetVirtualPath was not implemented with large collections of routes in mind. I am using ASP.Net MVC 1.0.
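    One way this ambiguity is usually avoided is to name the routes and generate the path against a specific name, since GetVirtualPath has an overload that takes a route name (extra values that the route doesn't define are then appended as querystring parameters). A sketch reusing the routes above; the route name "UserProfile" is made up for the illustration:

        // Sketch: give the target route a name so path generation can address it explicitly
        routes.MapRoute("UserProfile", "user/{username}",
            new { controller = "Users", action = "Profile" });

        // Build the link against that named route rather than letting the
        // route table pick the first route that can absorb the values.
        var virtualPathData = RouteTable.Routes.GetVirtualPath(
            this.viewContext.RequestContext,
            "UserProfile",
            pageLinkValueDictionary);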

    Read the article

  • em vs px... for mobile browsers...

    - by jitendra
    On the desktop, all modern browsers have full-page zoom, so we can use px. But if the same site can also be viewed on mobile, would px be bad for zooming in mobile browsers, or is px fine for mobile browsers too? Even if we don't care about IE6, should we still use em in place of px when we are not making a separate mobile site and the same site will be seen on both desktop and mobile phones (iPhone, BlackBerry, Windows Mobile, Opera Mini, Android, etc.)?

    Read the article

  • Widgets and .mobi sites, is this possible?

    - by Roland
    I have a couple of concerns. I'm busy building a normal .mobi site for a client, and as I understand it the idea is to keep it simple, since most phones do not support JavaScript, have small screens, etc. So I'm building the .mobi site using only content and basic links. Now my question is: how do widgets work on a mobile site? I've googled and could not find an answer. Is this possible at all?

    Read the article

  • How to retain the values of the filters and their results using ASP.NET C#?

    - by user144842
    Question: the page is a typical search page with a few filters on it. When the user searches for records based on the filters, the results are shown in a GridView. From the GridView records, the user can click on any record to see its details, which takes them to a new page. It's working fine so far. But when the user comes back from the details page to the search page, I am losing the selected filter values and there are no results in the GridView. How can I display the selected filters and their results in the GridView when the user comes back to the search page? Any example, etc.? FYI, I am using sessions to pass parameters to the ObjectDataSource.
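    A minimal sketch of the pattern often used for this, as a code-behind fragment: since the parameters are already kept in Session, the search page only needs to restore the filter controls and rebind on its first load. The control IDs, the session key and the payload type are invented for the illustration.

        // Sketch: restore previously chosen filter values on the first load of the search page
        protected void Page_Load(object sender, EventArgs e)
        {
            if (!IsPostBack && Session["SearchFilters"] != null)
            {
                // hypothetical filter controls and session payload
                var filters = (Dictionary<string, string>)Session["SearchFilters"];
                ddlDistance.SelectedValue = filters["Distance"];
                txtZip.Text = filters["Zip"];

                gvResults.DataBind();   // ObjectDataSource re-reads its Session-backed parameters
            }
        }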

    Read the article

  • disable SSL on MAMP

    - by morktron
    Hi, I'm used to editing sites locally on my MAMP to test out changes before going live. In this case, though, the site has an SSL certificate and wants to use it when I go to the admin area, so I can't get into admin. The error message says:

        (Error code: ssl_error_rx_record_too_long)

    It's a Joomla site I'm trying to log into locally, i.e. http://localhost:8888/site/administrator/. I've tried https as well, but same thing; the same thing happens in Safari. I just need to turn off SSL - it must be set in a file somewhere in the site I downloaded.
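    If the setting in question is Joomla's own SSL enforcement, it normally lives in configuration.php at the site root. A sketch (the variable name matches Joomla 1.5-era configuration files as far as I recall, so treat it as an assumption and check the downloaded file):

        <?php
        // configuration.php sketch: 0 = no SSL enforcement, 1 = administrator only, 2 = entire site
        class JConfig {
            // ... other settings unchanged ...
            var $force_ssl = 0;   // set to 0 for the local MAMP copy
        }
        ?>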

    Read the article
