Search Results

Search found 24378 results on 976 pages for 'pinned site'.


  • How to use Google Maps on my site?

    - by user75472
    Hi, I have been trying to find a link or document that explains how to use Google Maps on my site. I am using PHP and need example code for this. I searched a lot, but maybe the way I searched was not right. Can anybody suggest a link where I can find example code for using Google Maps? Thanks in advance.
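    For reference, a minimal sketch of one common approach, assuming you have a Maps JavaScript API key (the key, coordinates, and element id below are placeholders): a PHP page that simply emits the Maps JavaScript API and creates a map in a div.

        <?php
        // map.php - a hypothetical page that renders a Google Map.
        // Replace YOUR_API_KEY and the coordinates with your own values.
        $lat = 51.5;   // example latitude
        $lng = -0.12;  // example longitude
        ?>
        <!DOCTYPE html>
        <html>
        <head>
          <style>#map { width: 600px; height: 400px; }</style>
        </head>
        <body>
          <div id="map"></div>
          <script>
            function initMap() {
              var center = { lat: <?php echo $lat; ?>, lng: <?php echo $lng; ?> };
              new google.maps.Map(document.getElementById('map'), { center: center, zoom: 12 });
            }
          </script>
          <script async defer
            src="https://maps.googleapis.com/maps/api/js?key=YOUR_API_KEY&callback=initMap"></script>
        </body>
        </html>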


  • Site login using cURL

    - by user311129
    I am using cURL for the first time. I need to log in to a site and my code is not working. Help. I have a problem with setting the cookie: curl_setopt($page, CURLOPT_COOKIEJAR, $cookiesPath);
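    A minimal login sketch with a cookie jar, for comparison; the login URL and the form field names (username/password) are assumptions, so check the site's actual login form:

        <?php
        // Hypothetical login target and credentials.
        $loginUrl    = 'https://example.com/login.php';
        $cookiesPath = __DIR__ . '/cookies.txt';   // must be writable by PHP

        $ch = curl_init($loginUrl);
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array(
            'username' => 'myuser',     // assumed field name
            'password' => 'mypassword', // assumed field name
        )));
        curl_setopt($ch, CURLOPT_COOKIEJAR, $cookiesPath);   // write received cookies here
        curl_setopt($ch, CURLOPT_COOKIEFILE, $cookiesPath);  // send them back on later requests
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        $response = curl_exec($ch);
        if ($response === false) {
            echo 'cURL error: ' . curl_error($ch);
        }
        curl_close($ch);

        // Subsequent requests in the same script can reuse the same cookie file
        // to stay logged in.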


  • troubleshooting steps for site that requires refresh to load

    - by user1691389
    How does one troubleshoot a site that loads sometimes, but then requires a reload at other times... sometimes it loads, sometimes it hangs and a refresh is required, sometimes more than once. What would you do in this situation? I'm just looking for basic troubleshooting steps to start me going in the right direction. In the meantime I'll be poking around in Chrome's "Inspect Element" but if there's specific things I should look at first let me know.


  • Loading cross domain XML with Javascript using a hybrid iframe-proxy/xsl/jsonp concept?

    - by Josef
    On our site www.foo.com we want to download and use http://feeds.foo.com/feed.xml with Javascript. We'll obviously use Access-Control but for browsers that don't support it we are considering the following as a fallback: On www.foo.com, we set document.domain, provide a callback function and load the feed into a (hidden) iframe: document.domain = 'foo.com'; function receive_data(data) { // process data }; var proxy = document.createElement('iframe'); proxy.src = 'http://feeds.foo.com/feed.xml'; document.body.appendChild(proxy); On feeds.foo.com, add an XSL to feed.xml and use it to transform the feed into an html document that also sets document.domain and calls the callback function in its parent with the feed data as json: <?xml version="1.0"?> <xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" version="1.0"> <xsl:template match="ROOT"> <html><body> <script type="text/javascript"> document.domain = 'foo.com'; parent.receive_data([<xsl:apply-templates/>]); </script> </body></html> </xsl:template> <!-- templates that transform data into json objects go here --> </xsl:stylesheet> Is there a better way to load XML from feeds.foo.com and what are the ramifications of this iframe-proxy/xslt/jsonp trick? (..and in what cases will it fail?) Remarks This does not work in Safari & Chrome but since both support Access-Control it's fine. We want little or no change to feeds.foo.com We are aware of (but not interested in) server-side proxy solutions update: wrote about it
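    As a side note, a small runtime feature test can decide between the direct route and the iframe fallback. This sketch only checks for cross-origin XMLHttpRequest (CORS) support; parseFeed and loadViaIframeProxy are placeholders for your own XML-to-JSON conversion and for the iframe fallback described above:

        function supportsCors() {
          // XMLHttpRequest gained the withCredentials property together with CORS support.
          return typeof XMLHttpRequest !== 'undefined' &&
                 'withCredentials' in new XMLHttpRequest();
        }

        if (supportsCors()) {
          var xhr = new XMLHttpRequest();
          xhr.open('GET', 'http://feeds.foo.com/feed.xml', true);
          xhr.onload = function () { receive_data(parseFeed(xhr.responseText)); };
          xhr.send();
        } else {
          // Fall back to the hidden iframe / document.domain approach described above.
          loadViaIframeProxy();
        }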


  • Windows Azure: Import/Export Hard Drives, VM ACLs, Web Sockets, Remote Debugging, Continuous Delivery, New Relic, Billing Alerts and More

    - by ScottGu
    Two weeks ago we released a giant set of improvements to Windows Azure, as well as a significant update of the Windows Azure SDK. This morning we released another massive set of enhancements to Windows Azure.  Today’s new capabilities include: Storage: Import/Export Hard Disk Drives to your Storage Accounts HDInsight: General Availability of our Hadoop Service in the cloud Virtual Machines: New VM Gallery, ACL support for VIPs Web Sites: WebSocket and Remote Debugging Support Notification Hubs: Segmented customer push notification support with tag expressions TFS & GIT: Continuous Delivery Support for Web Sites + Cloud Services Developer Analytics: New Relic support for Web Sites + Mobile Services Service Bus: Support for partitioned queues and topics Billing: New Billing Alert Service that sends emails notifications when your bill hits a threshold you define All of these improvements are now available to use immediately (note that some features are still in preview).  Below are more details about them. Storage: Import/Export Hard Disk Drives to Windows Azure I am excited to announce the preview of our new Windows Azure Import/Export Service! The Windows Azure Import/Export Service enables you to move large amounts of on-premises data into and out of your Windows Azure Storage accounts. It does this by enabling you to securely ship hard disk drives directly to our Windows Azure data centers. Once we receive the drives we’ll automatically transfer the data to or from your Windows Azure Storage account.  This enables you to import or export massive amounts of data more quickly and cost effectively (and not be constrained by available network bandwidth). Encrypted Transport Our Import/Export service provides built-in support for BitLocker disk encryption – which enables you to securely encrypt data on the hard drives before you send it, and not have to worry about it being compromised even if the disk is lost/stolen in transit (since the content on the transported hard drives is completely encrypted and you are the only one who has the key to it).  The drive preparation tool we are shipping today makes setting up bitlocker encryption on these hard drives easy. How to Import/Export your first Hard Drive of Data You can read our Getting Started Guide to learn more about how to begin using the import/export service.  You can create import and export jobs via the Windows Azure Management Portal as well as programmatically using our Server Management APIs. It is really easy to create a new import or export job using the Windows Azure Management Portal.  Simply navigate to a Windows Azure storage account, and then click the new Import/Export tab now available within it (note: if you don’t have this tab make sure to sign-up for the Import/Export preview): Then click the “Create Import Job” or “Create Export Job” commands at the bottom of it.  This will launch a wizard that easily walks you through the steps required: For more comprehensive information about Import/Export, refer to Windows Azure Storage team blog.  You can also send questions and comments to the [email protected] email address. We think you’ll find this new service makes it much easier to move data into and out of Windows Azure, and it will dramatically cut down the network bandwidth required when working on large data migration projects.  We hope you like it. HDInsight: 100% Compatible Hadoop Service in the Cloud Last week we announced the general availability release of Windows Azure HDInsight. 
HDInsight is a 100% compatible Hadoop service that allows you to easily provision and manage Hadoop clusters for big data processing in Windows Azure.  This release is now live in production, backed by an enterprise SLA, supported 24x7 by Microsoft Support, and is ready to use for production scenarios. HDInsight allows you to use Apache Hadoop tools, such as Pig and Hive, to process large amounts of data in Windows Azure Blob Storage. Because data is stored in Windows Azure Blob Storage, you can choose to dynamically create Hadoop clusters only when you need them, and then shut them down when they are no longer required (since you pay only for the time the Hadoop cluster instances are running this provides a super cost effective way to use them).  You can create Hadoop clusters using either the Windows Azure Management Portal (see below) or using our PowerShell and Cross Platform Command line tools: The import/export hard drive support that came out today is a perfect companion service to use with HDInsight – the combination allows you to easily ingest, process and optionally export a limitless amount of data.  We’ve also integrated HDInsight with our Business Intelligence tools, so users can leverage familiar tools like Excel in order to analyze the output of jobs.  You can find out more about how to get started with HDInsight here. Virtual Machines: VM Gallery Enhancements Today’s update of Windows Azure brings with it a new Virtual Machine gallery that you can use to create new VMs in the cloud.  You can launch the gallery by doing New->Compute->Virtual Machine->From Gallery within the Windows Azure Management Portal: The new Virtual Machine Gallery includes some nice enhancements that make it even easier to use: Search: You can now easily search and filter images using the search box in the top-right of the dialog.  For example, simply type “SQL” and we’ll filter to show those images in the gallery that contain that substring. Category Tree-view: Each month we add more built-in VM images to the gallery.  You can continue to browse these using the “All” view within the VM Gallery – or now quickly filter them using the category tree-view on the left-hand side of the dialog.  For example, by selecting “Oracle” in the tree-view you can now quickly filter to see the official Oracle supplied images. MSDN and Supported checkboxes: With today’s update we are also introducing filters that makes it easy to filter out types of images that you may not be interested in. The first checkbox is MSDN: using this filter you can exclude any image that is not part of the Windows Azure benefits for MSDN subscribers (which have highly discounted pricing - you can learn more about the MSDN pricing here). The second checkbox is Supported: this filter will exclude any image that contains prerelease software, so you can feel confident that the software you choose to deploy is fully supported by Windows Azure and our partners. Sort options: We sort gallery images by what we think customers are most interested in, but sometimes you might want to sort using different views. So we’re providing some additional sort options, like “Newest,” to customize the image list for what suits you best. Pricing information: We now provide additional pricing information about images and options on how to cost effectively run them directly within the VM Gallery. The above improvements make it even easier to use the VM Gallery and quickly create launch and run Virtual Machines in the cloud. 
Virtual Machines: ACL Support for VIPs A few months ago we exposed the ability to configure Access Control Lists (ACLs) for Virtual Machines using Windows PowerShell cmdlets and our Service Management API. With today’s release, you can now configure VM ACLs using the Windows Azure Management Portal as well. You can now do this by clicking the new Manage ACL command in the Endpoints tab of a virtual machine instance: This will enable you to configure an ordered list of permit and deny rules to scope the traffic that can access your VM’s network endpoints. For example, if you were on a virtual network, you could limit RDP access to a Windows Azure virtual machine to only a few computers attached to your enterprise. Or if you weren’t on a virtual network you could alternatively limit traffic from public IPs that can access your workloads: Here is the default behaviors for ACLs in Windows Azure: By default (i.e. no rules specified), all traffic is permitted. When using only Permit rules, all other traffic is denied. When using only Deny rules, all other traffic is permitted. When there is a combination of Permit and Deny rules, all other traffic is denied. Lastly, remember that configuring endpoints does not automatically configure them within the VM if it also has firewall rules enabled at the OS level.  So if you create an endpoint using the Windows Azure Management Portal, Windows PowerShell, or REST API, be sure to also configure your guest VM firewall appropriately as well. Web Sites: Web Sockets Support With today’s release you can now use Web Sockets with Windows Azure Web Sites.  This feature enables you to easily integrate real-time communication scenarios within your web based applications, and is available at no extra charge (it even works with the free tier).  Higher level programming libraries like SignalR and socket.io are also now supported with it. You can enable Web Sockets support on a web site by navigating to the Configure tab of a Web Site, and by toggling Web Sockets support to “on”: Once Web Sockets is enabled you can start to integrate some really cool scenarios into your web applications.  Check out the new SignalR documentation hub on www.asp.net to learn more about some of the awesome scenarios you can do with it. Web Sites: Remote Debugging Support The Windows Azure SDK 2.2 we released two weeks ago introduced remote debugging support for Windows Azure Cloud Services. With today’s Windows Azure release we are extending this remote debugging support to also work with Windows Azure Web Sites. With live, remote debugging support inside of Visual Studio, you are able to have more visibility than ever before into how your code is operating live in Windows Azure. It is now super easy to attach the debugger and quickly see what is going on with your application in the cloud. Remote Debugging of a Windows Azure Web Site using VS 2013 Enabling the remote debugging of a Windows Azure Web Site using VS 2013 is really easy.  Start by opening up your web application’s project within Visual Studio. Then navigate to the “Server Explorer” tab within Visual Studio, and click on the deployed web-site you want to debug that is running within Windows Azure using the Windows Azure->Web Sites node in the Server Explorer.  Then right-click and choose the “Attach Debugger” option on it: When you do this Visual Studio will remotely attach the debugger to the Web Site running within Windows Azure.  
The debugger will then stop the web site’s execution when it hits any break points that you have set within your web application’s project inside Visual Studio.  For example, below I set a breakpoint on the “ViewBag.Message” assignment statement within the HomeController of the standard ASP.NET MVC project template.  When I hit refresh on the “About” page of the web site within the browser, the breakpoint was triggered and I am now able to debug the app remotely using Visual Studio: Note above how we can debug variables (including autos/watchlist/etc), as well as use the Immediate and Command Windows. In the debug session above I used the Immediate Window to explore some of the request object state, as well as to dynamically change the ViewBag.Message property.  When we click the the “Continue” button (or press F5) the app will continue execution and the Web Site will render the content back to the browser.  This makes it super easy to debug web apps remotely. Tips for Better Debugging To get the best experience while debugging, we recommend publishing your site using the Debug configuration within Visual Studio’s Web Publish dialog. This will ensure that debug symbol information is uploaded to the Web Site which will enable a richer debug experience within Visual Studio.  You can find this option on the Web Publish dialog on the Settings tab: When you ultimately deploy/run the application in production we recommend using the “Release” configuration setting – the release configuration is memory optimized and will provide the best production performance.  To learn more about diagnosing and debugging Windows Azure Web Sites read our new Troubleshooting Windows Azure Web Sites in Visual Studio guide. Notification Hubs: Segmented Push Notification support with tag expressions In August we announced the General Availability of Windows Azure Notification Hubs - a powerful Mobile Push Notifications service that makes it easy to send high volume push notifications with low latency from any mobile app back-end.  Notification hubs can be used with any mobile app back-end (including ones built using our Mobile Services capability) and can also be used with back-ends that run in the cloud as well as on-premises. Beginning with the initial release, Notification Hubs allowed developers to send personalized push notifications to both individual users as well as groups of users by interest, by associating their devices with tags representing the logical target of the notification. For example, by registering all devices of customers interested in a favorite MLB team with a corresponding tag, it is possible to broadcast one message to millions of Boston Red Sox fans and another message to millions of St. Louis Cardinals fans with a single API call respectively. New support for using tag expressions to enable advanced customer segmentation With today’s release we are adding support for even more advanced customer targeting.  You can now identify customers that you want to send push notifications to by defining rich tag expressions. With tag expressions, you can now not only broadcast notifications to Boston Red Sox fans, but take that segmenting a step farther and reach more granular segments. This opens up a variety of scenarios, for example: Offers based on multiple preferences—e.g. send a game day vegetarian special to users tagged as both a Boston Red Sox fan AND a vegetarian Push content to multiple segments in a single message—e.g. 
rain delay information only to users who are tagged as either a Boston Red Sox fan OR a St. Louis Cardinal fan Avoid presenting subsets of a segment with irrelevant content—e.g. season ticket availability reminder to users who are tagged as a Boston Red Sox fan but NOT also a season ticket holder To illustrate with code, consider a restaurant chain app that sends an offer related to a Red Sox vs Cardinals game for users in Boston. Devices can be tagged by your app with location tags (e.g. “Loc:Boston”) and interest tags (e.g. “Follows:RedSox”, “Follows:Cardinals”), and then a notification can be sent by your back-end to “(Follows:RedSox || Follows:Cardinals) && Loc:Boston” in order to deliver an offer to all devices in Boston that follow either the RedSox or the Cardinals. This can be done directly in your server backend send logic using the code below: var notification = new WindowsNotification(messagePayload); hub.SendNotificationAsync(notification, "(Follows:RedSox || Follows:Cardinals) && Loc:Boston"); In your expressions you can use all Boolean operators: AND (&&), OR (||), and NOT (!).  Some other cool use cases for tag expressions that are now supported include: Social: To “all my group except me” - group:id && !user:id Events: Touchdown event is sent to everybody following either team or any of the players involved in the action: Followteam:A || Followteam:B || followplayer:1 || followplayer:2 … Hours: Send notifications at specific times. E.g. Tag devices with time zone and when it is 12pm in Seattle send to: GMT8 && follows:thaifood Versions and platforms: Send a reminder to people still using your first version for Android - version:1.0 && platform:Android For help on getting started with Notification Hubs, visit the Notification Hub documentation center.  Then download the latest NuGet package (or use the Notification Hubs REST APIs directly) to start sending push notifications using tag expressions.  They are really powerful and enable a bunch of great new scenarios. TFS & GIT: Continuous Delivery Support for Web Sites + Cloud Services With today’s Windows Azure release we are making it really easy to enable continuous delivery support with Windows Azure and Team Foundation Services.  Team Foundation Services is a cloud based offering from Microsoft that provides integrated source control (with both TFS and Git support), build server, test execution, collaboration tools, and agile planning support.  It makes it really easy to setup a team project (complete with automated builds and test runners) in the cloud, and it has really rich integration with Visual Studio. With today’s Windows Azure release it is now really easy to enable continuous delivery support with both TFS and Git based repositories hosted using Team Foundation Services.  This enables a workflow where when code is checked in, built successfully on an automated build server, and all tests pass on it – I can automatically have the app deployed on Windows Azure with zero manual intervention or work required. The below screen-shots demonstrate how to quickly setup a continuous delivery workflow to Windows Azure with a Git-based ASP.NET MVC project hosted using Team Foundation Services. Enabling Continuous Delivery to Windows Azure with Team Foundation Services The project I’m going to enable continuous delivery with is a simple ASP.NET MVC project whose source code I’m hosting using Team Foundation Services.  
I did this by creating a “SimpleContinuousDeploymentTest” repository there using Git – and then used the new built-in Git tooling support within Visual Studio 2013 to push the source code to it.  Below is a screen-shot of the Git repository hosted within Team Foundation Services: I can access the repository within Visual Studio 2013 and easily make commits with it (as well as branch, merge and do other tasks).  Using VS 2013 I can also setup automated builds to take place in the cloud using Team Foundation Services every time someone checks in code to the repository: The cool thing about this is that I don’t have to buy or rent my own build server – Team Foundation Services automatically maintains its own build server farm and can automatically queue up a build for me (for free) every time someone checks in code using the above settings.  This build server (and automated testing) support now works with both TFS and Git based source control repositories. Connecting a Team Foundation Services project to Windows Azure Once I have a source repository hosted in Team Foundation Services with Automated Builds and Testing set up, I can then go even further and set it up so that it will be automatically deployed to Windows Azure when a source code commit is made to the repository (assuming the Build + Tests pass).  Enabling this is now really easy.  To set this up with a Windows Azure Web Site simply use the New->Compute->Web Site->Custom Create command inside the Windows Azure Management Portal.  This will create a dialog like below.  I gave the web site a name and then made sure the “Publish from source control” checkbox was selected: When we click next we’ll be prompted for the location of the source repository.  We’ll select “Team Foundation Services”: Once we do this we’ll be prompted for our Team Foundation Services account that our source repository is hosted under (in this case my TFS account is “scottguthrie”): When we click the “Authorize Now” button we’ll be prompted to give Windows Azure permissions to connect to the Team Foundation Services account.  Once we do this we’ll be prompted to pick the source repository we want to connect to.  Starting with today’s Windows Azure release you can now connect to both TFS and Git based source repositories.  This new support allows me to connect to the “SimpleContinuousDeploymentTest” respository we created earlier: Clicking the finish button will then create the Web Site with the continuous delivery hooks setup with Team Foundation Services.  Now every time someone pushes source control to the repository in Team Foundation Services, it will kick off an automated build, run all of the unit tests in the solution , and if they pass the app will be automatically deployed to our Web Site in Windows Azure.  You can monitor the history and status of these automated deployments using the Deployments tab within the Web Site: This enables a really slick continuous delivery workflow, and enables you to build and deploy apps in a really nice way. Developer Analytics: New Relic support for Web Sites + Mobile Services With today’s Windows Azure release we are making it really easy to enable Developer Analytics and Monitoring support with both Windows Azure Web Site and Windows Azure Mobile Services.  We are partnering with New Relic, who provide a great dev analytics and app performance monitoring offering, to enable this - and we have updated the Windows Azure Management Portal to make it really easy to configure. 
Enabling New Relic with a Windows Azure Web Site Enabling New Relic support with a Windows Azure Web Site is now really easy.  Simply navigate to the Configure tab of a Web Site and scroll down to the “developer analytics” section that is now within it: Clicking the “add-on” button will display some additional UI.  If you don’t already have a New Relic subscription, you can click the “view windows azure store” button to obtain a subscription (note: New Relic has a perpetually free tier so you can enable it even without paying anything): Clicking the “view windows azure store” button will launch the integrated Windows Azure Store experience we have within the Windows Azure Management Portal.  You can use this to browse from a variety of great add-on services – including New Relic: Select “New Relic” within the dialog above, then click the next button, and you’ll be able to choose which type of New Relic subscription you wish to purchase.  For this demo we’ll simply select the “Free Standard Version” – which does not cost anything and can be used forever:  Once we’ve signed-up for our New Relic subscription and added it to our Windows Azure account, we can go back to the Web Site’s configuration tab and choose to use the New Relic add-on with our Windows Azure Web Site.  We can do this by simply selecting it from the “add-on” dropdown (it is automatically populated within it once we have a New Relic subscription in our account): Clicking the “Save” button will then cause the Windows Azure Management Portal to automatically populate all of the needed New Relic configuration settings to our Web Site: Deploying the New Relic Agent as part of a Web Site The final step to enable developer analytics using New Relic is to add the New Relic runtime agent to our web app.  We can do this within Visual Studio by right-clicking on our web project and selecting the “Manage NuGet Packages” context menu: This will bring up the NuGet package manager.  You can search for “New Relic” within it to find the New Relic agent.  Note that there is both a 32-bit and 64-bit edition of it – make sure to install the version that matches how your Web Site is running within Windows Azure (note: you can configure your Web Site to run in either 32-bit or 64-bit mode using the Web Site’s “Configuration” tab within the Windows Azure Management Portal): Once we install the NuGet package we are all set to go.  We’ll simply re-publish the web site again to Windows Azure and New Relic will now automatically start monitoring the application Monitoring a Web Site using New Relic Now that the application has developer analytics support with New Relic enabled, we can launch the New Relic monitoring portal to start monitoring the health of it.  We can do this by clicking on the “Add Ons” tab in the left-hand side of the Windows Azure Management Portal.  Then select the New Relic add-on we signed-up for within it.  The Windows Azure Management Portal will provide some default information about the add-on when we do this.  Clicking the “Manage” button in the tray at the bottom will launch a new browser tab and single-sign us into the New Relic monitoring portal associated with our account: When we do this a new browser tab will launch with the New Relic admin tool loaded within it: We can now see insights into how our app is performing – without having to have written a single line of monitoring code.  
The New Relic service provides a ton of great built-in monitoring features allowing us to quickly see: Performance times (including browser rendering speed) for the overall site and individual pages.  You can optionally set alert thresholds to trigger if the speed does not meet a threshold you specify. Information about where in the world your customers are hitting the site from (and how performance varies by region) Details on the latency performance of external services your web apps are using (for example: SQL, Storage, Twitter, etc) Error information including call stack details for exceptions that have occurred at runtime SQL Server profiling information – including which queries executed against your database and what their performance was And a whole bunch more… The cool thing about New Relic is that you don’t need to write monitoring code within your application to get all of the above reports (plus a lot more).  The New Relic agent automatically enables the CLR profiler within applications and automatically captures the information necessary to identify these.  This makes it super easy to get started and immediately have a rich developer analytics view for your solutions with very little effort. If you haven’t tried New Relic out yet with Windows Azure I recommend you do so – I think you’ll find it helps you build even better cloud applications.  Following the above steps will help you get started and deliver you a really good application monitoring solution in only minutes. Service Bus: Support for partitioned queues and topics With today’s release, we are enabling support within Service Bus for partitioned queues and topics. Enabling partitioning enables you to achieve a higher message throughput and better availability from your queues and topics. Higher message throughput is achieved by implementing multiple message brokers for each partitioned queue and topic.  The  multiple messaging stores will also provide higher availability. You can create a partitioned queue or topic by simply checking the Enable Partitioning option in the custom create wizard for a Queue or Topic: Read this article to learn more about partitioned queues and topics and how to take advantage of them today. Billing: New Billing Alert Service Today’s Windows Azure update enables a new Billing Alert Service Preview that enables you to get proactive email notifications when your Windows Azure bill goes above a certain monetary threshold that you configure.  This makes it easier to manage your bill and avoid potential surprises at the end of the month. With the Billing Alert Service Preview, you can now create email alerts to monitor and manage your monetary credits or your current bill total.  To set up an alert first sign-up for the free Billing Alert Service Preview.  Then visit the account management page, click on a subscription you have setup, and then navigate to the new Alerts tab that is available: The alerts tab allows you to setup email alerts that will be sent automatically once a certain threshold is hit.  For example, by clicking the “add alert” button above I can setup a rule to send myself email anytime my Windows Azure bill goes above $100 for the month: The Billing Alert Service will evolve to support additional aspects of your bill as well as support multiple forms of alerts such as SMS.  Try out the new Billing Alert Service Preview today and give us feedback. 
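    As a supplement to the Service Bus section above: the Enable Partitioning checkbox in the portal is one route, and for completeness here is a rough sketch of doing the same from the Service Bus .NET SDK (the connection string and queue name are placeholders):

        using Microsoft.ServiceBus;
        using Microsoft.ServiceBus.Messaging;

        class CreatePartitionedQueue
        {
            static void Main()
            {
                // Placeholder connection string from the Windows Azure Management Portal.
                var connectionString = "Endpoint=sb://yournamespace.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=...";
                var namespaceManager = NamespaceManager.CreateFromConnectionString(connectionString);

                var description = new QueueDescription("orders")
                {
                    EnablePartitioning = true   // spread the queue across multiple message brokers/stores
                };

                if (!namespaceManager.QueueExists(description.Path))
                {
                    namespaceManager.CreateQueue(description);
                }
            }
        }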
Summary Today’s Windows Azure release enables a ton of great new scenarios, and makes building applications hosted in the cloud even easier. If you don’t already have a Windows Azure account, you can sign-up for a free trial and start using all of the above features today.  Then visit the Windows Azure Developer Center to learn more about how to build apps with it. Hope this helps, Scott P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu


  • Scraping a website with a JavaScript cookie in C#

    - by erwin
    Hi all, I want to scrape some things from the following site: http://www.conrad.nl/modelspoor. This is my function:

        public string SreenScrape(string urlBase, string urlPath)
        {
            CookieContainer cookieContainer = new CookieContainer();
            HttpWebRequest httpWebRequest = (HttpWebRequest)WebRequest.Create(urlBase + urlPath);
            httpWebRequest.CookieContainer = cookieContainer;
            httpWebRequest.UserAgent = "Mozilla/6.0 (Windows; U; Windows NT 7.0; en-US; rv:1.9.0.8) Gecko/2009032609 Firefox/3.0.9 (.NET CLR 3.5.30729)";
            WebResponse webResponse = httpWebRequest.GetResponse();
            string result = new System.IO.StreamReader(webResponse.GetResponseStream(), Encoding.Default).ReadToEnd();
            webResponse.Close();
            if (result.Contains("<frame src="))
            {
                Regex metaregex = new Regex("http:[a-z:/._0-9!?=A-Z&]*", RegexOptions.Multiline);
                result = result.Replace("\r\n", "");
                Match m = metaregex.Match(result);
                string key = m.Groups[0].Value;
                foreach (Match match in metaregex.Matches(result))
                {
                    HttpWebRequest redirectHttpWebRequest = (HttpWebRequest)WebRequest.Create(key);
                    redirectHttpWebRequest.CookieContainer = cookieContainer;
                    webResponse = redirectHttpWebRequest.GetResponse();
                    string redirectResponse = new System.IO.StreamReader(webResponse.GetResponseStream(), Encoding.Default).ReadToEnd();
                    webResponse.Close();
                    return redirectResponse;
                }
            }
            return result;
        }

    But when I do this I get back a page from the website saying that it requires JavaScript. Does anybody know how to fix this? Kind regards, Erwin
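    One workaround worth trying, assuming the page only checks for a cookie that is normally set by JavaScript: look up that cookie's name and value in your browser's developer tools and add it to the CookieContainer by hand before making the request. The cookie name and value below are placeholders:

        // Hypothetical cookie that the site's JavaScript would normally set.
        CookieContainer cookieContainer = new CookieContainer();
        cookieContainer.Add(new Cookie("jsCheckCookie", "1", "/", "www.conrad.nl"));

        HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://www.conrad.nl/modelspoor");
        request.CookieContainer = cookieContainer;
        request.UserAgent = "Mozilla/5.0";

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new System.IO.StreamReader(response.GetResponseStream(), Encoding.Default))
        {
            string html = reader.ReadToEnd();
        }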


  • Windows-1251 file inside UTF-8 site?

    - by Spoonk
    Hello everyone Masters Of Web Delevopment :) I have a piece of PHP script that fetches last 10 played songs from my winamp. This script is inside file (lets call it "lastplayed.php") which is included in my site with php include function inside a "div". My site is on UTF-8 encoding. The problem is that some songs titles are in Windows-1251 encoding. And in my site they displays like "??????"... Is there any known way to tell to this div with included "lastplayed.php" in it, to be with windows-1251 encoding? Or any other suggestions? P.S: The file with fetching script a.k.a. "lastplayed.php", is converted to UTF-8. But if it is ANCII it's the same result. I try to put and meta tag with windows-1251 between head tag but nothing happens again. P.P.S: Script that fetches the Winamp's data (lastplayed.php): <?php /****** * You may use and/or modify this script as long as you: * 1. Keep my name & webpage mentioned * 2. Don't use it for commercial purposes * * If you want to use this script without complying to the rules above, please contact me first at: [email protected] * * Author: Martijn Korse * Website: http://devshed.excudo.net * * Date: 08-05-2006 ***/ /** * version 2.0 */ class Radio { var $fields = array(); var $fieldsDefaults = array("Server Status", "Stream Status", "Listener Peak", "Average Listen Time", "Stream Title", "Content Type", "Stream Genre", "Stream URL", "Current Song"); var $very_first_str; var $domain, $port, $path; var $errno, $errstr; var $trackLists = array(); var $isShoutcast; var $nonShoutcastData = array( "Server Status" => "n/a", "Stream Status" => "n/a", "Listener Peak" => "n/a", "Average Listen Time" => "n/a", "Stream Title" => "n/a", "Content Type" => "n/a", "Stream Genre" => "n/a", "Stream URL" => "n/a", "Stream AIM" => "n/a", "Stream IRC" => "n/a", "Current Song" => "n/a" ); var $altServer = False; function Radio($url) { $parsed_url = parse_url($url); $this->domain = isset($parsed_url['host']) ? $parsed_url['host'] : ""; $this->port = !isset($parsed_url['port']) || empty($parsed_url['port']) ? "80" : $parsed_url['port']; $this->path = empty($parsed_url['path']) ? "/" : $parsed_url['path']; if (empty($this->domain)) { $this->domain = $this->path; $this->path = ""; } $this->setOffset("Current Stream Information"); $this->setFields(); // setting default fields $this->setTableStart("<table border=0 cellpadding=2 cellspacing=2>"); $this->setTableEnd("</table>"); } function setFields($array=False) { if (!$array) $this->fields = $this->fieldsDefaults; else $this->fields = $array; } function setOffset($string) { $this->very_first_str = $string; } function setTableStart($string) { $this->tableStart = $string; } function setTableEnd($string) { $this->tableEnd = $string; } function getHTML($page=False) { if (!$page) $page = $this->path; $contents = ""; $domain = (substr($this->domain, 0, 7) == "http://") ? substr($this->domain, 7) : $this->domain; if (@$fp = fsockopen($domain, $this->port, $this->errno, $this->errstr, 2)) { fputs($fp, "GET ".$page." HTTP/1.1\r\n". "User-Agent: Mozilla/4.0 (compatible; MSIE 5.5; Windows 98)\r\n". "Accept: */*\r\n". 
"Host: ".$domain."\r\n\r\n"); $c = 0; while (!feof($fp) && $c <= 20) { $contents .= fgets($fp, 4096); $c++; } fclose ($fp); preg_match("/(Content-Type:)(.*)/i", $contents, $matches); if (count($matches) > 0) { $contentType = trim($matches[2]); if ($contentType == "text/html") { $this->isShoutcast = True; return $contents; } else { $this->isShoutcast = False; $htmlContent = substr($contents, 0, strpos($contents, "\r\n\r\n")); $dataStr = str_replace("\r", "\n", str_replace("\r\n", "\n", $contents)); $lines = explode("\n", $dataStr); foreach ($lines AS $line) { if ($dp = strpos($line, ":")) { $key = substr($line, 0, $dp); $value = trim(substr($line, ($dp+1))); if (preg_match("/genre/i", $key)) $this->nonShoutcastData['Stream Genre'] = $value; if (preg_match("/name/i", $key)) $this->nonShoutcastData['Stream Title'] = $value; if (preg_match("/url/i", $key)) $this->nonShoutcastData['Stream URL'] = $value; if (preg_match("/content-type/i", $key)) $this->nonShoutcastData['Content Type'] = $value; if (preg_match("/icy-br/i", $key)) $this->nonShoutcastData['Stream Status'] = "Stream is up at ".$value."kbps"; if (preg_match("/icy-notice2/i", $key)) { $this->nonShoutcastData['Server Status'] = "This is <span style=\"color: red;\">not</span> a Shoutcast server!"; if (preg_match("/ultravox/i", $value)) $this->nonShoutcastData['Server Status'] .= " But an <a href=\"http://ultravox.aol.com/\" target=\"_blank\">Ultravox</a> Server"; $this->altServer = $value; } } } return nl2br($htmlContent); } } else return $contents; } else { return False; } } function getServerInfo($display_array=null, $very_first_str=null) { if (!isset($display_array)) $display_array = $this->fields; if (!isset($very_first_str)) $very_first_str = $this->very_first_str; if ($html = $this->getHTML()) { // parsing the contents $data = array(); foreach ($display_array AS $key => $item) { if ($this->isShoutcast) { $very_first_pos = stripos($html, $very_first_str); $first_pos = stripos($html, $item, $very_first_pos); $line_start = strpos($html, "<td>", $first_pos); $line_end = strpos($html, "</td>", $line_start) + 4; $difference = $line_end - $line_start; $line = substr($html, $line_start, $difference); $data[$key] = strip_tags($line); } else { $data[$key] = $this->nonShoutcastData[$item]; } } return $data; } else { return $this->errstr." 
(".$this->errno.")"; } } function createHistoryArray($page) { if (!in_array($page, $this->trackLists)) { $this->trackLists[] = $page; if ($html = $this->getHTML($page)) { $fromPos = stripos($html, $this->tableStart); $toPos = stripos($html, $this->tableEnd, $fromPos); $tableData = substr($html, $fromPos, ($toPos-$fromPos)); $lines = explode("</tr><tr>", $tableData); $tracks = array(); $c = 0; foreach ($lines AS $line) { $info = explode ("</td><td>", $line); $time = trim(strip_tags($info[0])); if (substr($time, 0, 9) != "Copyright" && !preg_match("/Tag Loomis, Tom Pepper and Justin Frankel/i", $info[1])) { $this->tracks[$c]['time'] = $time; $this->tracks[$c++]['track'] = trim(strip_tags($info[1])); } } if (count($this->tracks) > 0) { unset($this->tracks[0]); if (isset($this->tracks[1])) $this->tracks[1]['track'] = str_replace("Current Song", "", $this->tracks[1]['track']); } } else { $this->tracks[0] = array("time"=>$this->errno, "track"=>$this->errstr); } } } function getHistoryArray($page="/played.html") { if (!in_array($page, $this->trackLists)) $this->createHistoryArray($page); return $this->tracks; } function getHistoryTable($page="/played.html", $trackColText=False, $class=False) { $title_utf8 = mb_convert_encoding($trackArr ,"utf-8" ,"auto"); if (!in_array($page, $this->trackLists)) $this->createHistoryArray($page); if ($trackColText) $output .= " <div class='lastplayed_top'></div> <div".($class ? " class=\"".$class."\"" : "").">"; foreach ($this->tracks AS $title_utf8) $output .= "<div style='padding:2px 0;'>".$title_utf8['track']."</div>"; $output .= "</div><div class='lastplayed_bottom'></div> <div class='lastplayed_title'>".$trackColText."</div> \n"; return $output; } } // this is needed for those with a php version < 5 // the function is copied from the user comments @ php.net (http://nl3.php.net/stripos) if (!function_exists("stripos")) { function stripos($haystack, $needle, $offset=0) { return strpos(strtoupper($haystack), strtoupper($needle), $offset); } } ?> And the calling script outside the lastplayed.php: include "lastplayed.php"; $radio = new Radio($ip.":".$port); echo $radio->getHistoryTable("/played.html", "<b>Last played:</b>", "lastplayed_content");


  • Editing CSS in an iframe to set a Select tag's text color to black?

    - by Corey Ogburn
    This is a very specific question for a Google Chrome extension. http://www.meebo.com/mobile/ This page is where you're kicked to when you go to Meebo.com on an iPhone or Droid phone. But if you notice, the Status box where you can set yourself away or set what you want your status to be has white text on a white background. In order to get a website to appear in a Google Chrome extension's popup window (the one that drops down when you click the icon next to the address bar) that isn't an included HTML file in the extension, I need to use an iframe. I know that there are security measures around cross-site stuff like JavaScript, and I'm not surprised I'm having trouble accessing the CSS. But there's a class, status, and its color is white and I need to change that to black. I've tested it with Chrome's Inspect Element window and if I change that, I'll be fine. I've tried changing the manifest.json file to inject a CSS file using Content-Scripts, but nothing... I'm new to Chrome extensions but I have experience doing web development.
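    For what it's worth, a content-script CSS override only reaches framed pages when all_frames is turned on, which is easy to miss and may be why the injection appeared to do nothing. A sketch of the relevant manifest.json fragment and stylesheet (the file name override.css is a placeholder; the .status selector is the class described above):

        "content_scripts": [
          {
            "matches": ["*://www.meebo.com/mobile/*"],
            "css": ["override.css"],
            "all_frames": true,
            "run_at": "document_end"
          }
        ]

        /* override.css */
        .status { color: #000 !important; }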


  • Creating DOM elements on the fly - check if the data is not harmful

    - by user313353
    I already posted a question closely related to this one. I watched the Mix10 video with P. Haacked and S. Hanselman. I am building an AJAX-powered site whose input forms are created on the fly. All the code to accomplish this is done within a script tag or a JavaScript file. For example the following DOM elements are created when the page loads and are wrapped into an existing div defined in a view: $('#myform').append('); $('#myform').append(''); When I click the submit button I need to get the values of the input form whose id is 'Name': $("#Name").val() and then I return a Json object: { Name: name }; For this kind of scenario there is no way to use Html.Encode() or AntiXss.HtmlEncode() on the client side. The only place to check that the input is not harmful is the server side (via a service layer). This seems like a limitation. All is fine if and only if a view has a set of predefined inputs. When it is time to create them on the fly, the situation is different. Have you thought of that situation, guys? Thanks for the attention you have put on this. Roland, Brussels, Belgium
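    If a client-side counterpart to Html.Encode() is all that is needed for display purposes, a common jQuery idiom is to let the browser do the escaping through a text node. This is only a convenience for output encoding; the server-side check remains the real validation, as the question says:

        // HTML-encode a string by writing it as text and reading it back as HTML.
        function htmlEncode(value) {
            return $('<div/>').text(value).html();
        }

        // Example: encode the dynamically created input's value before echoing it anywhere.
        var safeName = htmlEncode($('#Name').val());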


  • Can you change the icon of a pinned IE 9 web application? And how do you do it?

    - by Rick Roth
    In IE 9 you have the ability to click and drag an open browser tab to the Windows 7 taskbar and pin the shortcut to the taskbar. This has the effect of creating a pseudo-application experience where the shortcut can have its own custom jump list and is not grouped with other IE 9 browser tabs on the taskbar. Windows uses the "shortcut icon" or "favicon" defined in the HTML for the icon on the taskbar. If no shortcut icon is defined, then the generic IE shortcut icon is used. If you have a bunch of these shortcuts pinned to the taskbar that don't have different icons, it can be confusing to the user which is which. Can you change the icon of a pinned IE 9 web application? And how do you do it?
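    On the site-owner side, the icon comes from the favicon declared in the page's head, and IE 9's pinned-site meta tags can round out the experience. A sketch of the markup involved (paths and names are placeholders; the icon should be an .ico file so Windows can use it on the taskbar):

        <!-- Icon that IE 9 / Windows 7 uses for the pinned shortcut -->
        <link rel="shortcut icon" href="/favicon.ico" />

        <!-- Optional IE 9 pinned-site metadata -->
        <meta name="application-name" content="My Web App" />
        <meta name="msapplication-tooltip" content="Launch My Web App" />
        <meta name="msapplication-starturl" content="/" />
        <meta name="msapplication-task"
              content="name=Home;action-uri=/;icon-uri=/favicon.ico" />

    Pinned shortcuts cache the icon, so after changing it the site usually has to be unpinned and pinned again before the new icon shows up.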


  • Tridion New UI Preview Site is not reflecting the changes unless published

    - by Ram G
    I have new UI setup and noticing that when ever I update a page it is not refreshing with the updated changes. I do not see either the page_{sessionId/GUID}.aspx created either. Checked the session preview DB and I see the changes in PAGE_CONTENT table with new rendered content, so seems like session preview is working fine but the Preview site is not able to get the changes and refresh the UI. I have checked all the preview handlers and mappings for .aspx and made sure they are correct in web.config. Any thoughts on why the preview site not showing up the changes? I have the session preview DB setup in cd_storage_conf.xml. <StorageBindings> <Bundle src="preview_dao_bundle.xml"/> </StorageBindings> <Wrappers> <Wrapper Name="SessionWrapper"> <Timeout>120000</Timeout> <Storage Type="persistence" Id="db-session-webservice" dialect="MSSQL" Class="com.tridion.storage.persistence.JPADAOFactory"> <Pool Type="jdbc" Size="5" MonitorInterval="60" IdleTimeout="120" CheckoutTimeout="120" /> <DataSource Class="com.microsoft.sqlserver.jdbc.SQLServerDataSource"> <Property Name="serverName" Value="localhost" /> <Property Name="portNumber" Value="1433" /> <Property Name="databaseName" Value="Tridion_Broker_SessionPreview" /> <Property Name="user" Value="usr" /> <Property Name="password" Value="pwd" /> </DataSource> </Storage> </Wrapper> </Wrappers> web.config (handlers): <add verb="GET" path="*.htm" type="Tridion.ContentDelivery.Preview.Web.StaticFileHandler" /> <add verb="GET" path="*.jpg" type="Tridion.ContentDelivery.Preview.Web.StaticFileHandler" /> <add verb="GET" path="*.png" type="Tridion.ContentDelivery.Preview.Web.StaticFileHandler" /> <add verb="GET" path="*.aspx" type="Tridion.ContentDelivery.Preview.Web.StaticFileHandler" /> <add verb="GET" path="*.html" type="Tridion.ContentDelivery.Preview.Web.StaticFileHandler" /> <add name="Tridion.ContentDelivery.Preview.Web.PreviewContentModule" type="Tridion.ContentDelivery.Preview.Web.PreviewContentModule" /> Log (timestamp and DEBUG prefix removed): ClaimStore - put: uri=taf:session:id, value=tridion_db59279b-7d37-4b2e-ad98-eaaa6af7038e ClaimStore - put: uri=taf:session:id, value=tridion_db59279b-7d37-4b2e-ad98-eaaa6af7038e ClaimStore - put: uri=taf:tracking:id, value=tridion_d1fa1017-a28d-4f48-a790-b74f78c69314 ClaimStore - put: uri=taf:tracking:id, value=tridion_d1fa1017-a28d-4f48-a790-b74f78c69314 SearchClaimProcessor - No match found for referrer string http://uidemo.practice.com/en/Product/musk.aspx SearchClaimProcessor - No match found for referrer string http://uidemo.practice.com/en/Product/musk.aspx ClaimStore - put: uri=taf:claim:ambientdata:footprintcartridge:devicetype, value=Desktop ClaimStore - put: uri=taf:claim:ambientdata:footprintcartridge:devicetype, value=Desktop ClaimStore - put: uri=taf:claim:ambientdata:footprintcartridge:mobiledevice, value=NotMobile ClaimStore - put: uri=taf:claim:ambientdata:footprintcartridge:acceptlanguage, value=en-US ClaimStore - put: uri=taf:claim:ambientdata:footprintcartridge:mobiledevice, value=NotMobile ClaimStore - put: uri=taf:claim:ambientdata:footprintcartridge:acceptlanguage, value=en-US PageHandler - The session wrappers are correctly installed. Any thoughts/pointers on what might be going wrong...? (sorry for the long post)


  • clock and date showing on a live site but not on localhost

    - by grumpypanda
    I've got clock.swf and date.swf working fine on a live site, now I am using the same code to set up a local develop environment. Everything is working well except the clock.swf and date.swf stopped working on localhost. Two same yellow errors "You need to update your Flash plugin. Click here if you want to continue." but of course my Flash player is up to date since the live site is working fine. I'll post the code below which I think has caused the error. I've been searching online for the last couple of hours but no luck, anyone has got into an issue like this before? What can be the possible cause? Any help is appreciated. This is on the index.php, I can post more code here if needed. <?php embed_flash("swf/clock.swf", CLOCK_WIDTH, CLOCK_HEIGHT, "8", '', "flashcontent");?> <?php embed_flash("swf/date.swf", DATE_WIDTH, DATE_HEIGHT, "8", '', "flashcontent_date");?> configure.php define('CLOCK_WIDTH', '450'); define('CLOCK_HEIGHT', ''); define('DATE_WIDTH', '440'); define('DATE_HEIGHT', ''); flash_function.php <?php function embed_flash($name, $w, $h, $version, $bgcolor, $id) { $cacheBuster = rand(); $padTop = $h/3; ?> <style> a.noflash:link, a.noflash:visited, a.noflash:active {color: #1860C2; text-decoration: none; background:#FFFFFF;} a.noflash:hover {color:#000; text-decoration:none; background:#EEEEEE;} .message { width: <?=$w;?>px; font-size:12px; font-weight:normal; margin-bottom: 10px; padding: 5px; color: #EEE; background: orange;"} </style> <div id="<?=$id; ?>" align="center"> <noscript> <div class="message"> Please enable <a href="https://www.google.com/support/adsense/bin/answer.py?answer=12654" target="_blank" class="noflash">&nbsp;JavaScript&nbsp;</a> to view this page properly. </div> </noscript> <div class="message"> You need to update your Flash plugin. Click <a href="http://www.adobe.com/shockwave/download/download.cgi?P1_Prod_Version=ShockwaveFlash&promoid=BIOW" target="_blank" class="noflash">&nbsp;here&nbsp;</a> if you want to continue. </div> </div> <script type="text/javascript"> // <![CDATA[ var so = new SWFObject("<?=$name;?>", "", "<?=$w;?>", "<?=$h;?>", "<?=$version;?>", "<?=$bgcolor;?>"); so.addParam("quality", "high"); so.addParam("allowScriptAccess", "sameDomain"); so.addParam("scale", "showall"); so.addParam("loop", "false"); so.addParam("wmode", "transparent"); so.write("<?=$id;?>"); // ]]> </script>


  • REST WCF service locks thread when called using AJAX in an ASP.Net site

    - by Jupaol
    I have a WCF REST service consumed in an ASP.Net site, from a page, using AJAX. I want to be able to call methods from my service async, which means I will have callback handlers in my javascript code and when the methods finish, the output will be updated. The methods should run in different threads, because each method will take different time to complete their task I have the code semi-working, but something strange is happening because the first time I execute the code after compiling, it works, running each call in a different threads but subsequent calls blocs the service, in such a way that each method call has to wait until the last call ends in order to execute the next one. And they are running on the same thread. I have had the same problem before when I was using Page Methods, and I solved it by disabling the session in the page but I have not figured it out how to do the same when consuming WCF REST services Note: Methods complete time (running them async should take only 7 sec and the result should be: Execute1 - Execute3 - Execute2) Execute1 -- 2 sec Execute2 -- 7 sec Execute3 -- 4 sec Output After compiling Output subsequent calls (this is the problem) I will post the code...I'll try to simplify it as much as I can Service Contract [ServiceContract( SessionMode = SessionMode.NotAllowed )] public interface IMyService { // I have other 3 methods like these: Execute2 and Execute3 [OperationContract] [WebInvoke( RequestFormat = WebMessageFormat.Json, ResponseFormat = WebMessageFormat.Json, UriTemplate = "/Execute1", Method = "POST")] string Execute1(string param); } [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)] [ServiceBehavior( InstanceContextMode = InstanceContextMode.PerCall )] public class MyService : IMyService { // I have other 3 methods like these: Execute2 (7 sec) and Execute3(4 sec) public string Execute1(string param) { var t = Observable.Start(() => Thread.Sleep(2000), Scheduler.NewThread); t.First(); return string.Format("Execute1 on: {0} count: {1} at: {2} thread: {3}", param, "0", DateTime.Now.ToString(), Thread.CurrentThread.ManagedThreadId.ToString()); } } ASPX page <%@ Page EnableSessionState="False" Title="Home Page" Language="C#" MasterPageFile="~/Site.master" AutoEventWireup="true" CodeBehind="Default.aspx.cs" Inherits="RestService._Default" %> <asp:Content ID="HeaderContent" runat="server" ContentPlaceHolderID="HeadContent"> <script type="text/javascript"> function callMethodAsync(url, data) { $("#message").append("<br/>" + new Date()); $.ajax({ cache: false, type: "POST", async: true, url: url, data: '"de"', contentType: "application/json", dataType: "json", success: function (msg) { $("#message").append("<br/>&nbsp;&nbsp;&nbsp;" + msg); }, error: function (xhr) { alert(xhr.responseText); } }); } $(function () { $("#callMany").click(function () { $("#message").html(""); callMethodAsync("/Execute1", "hello"); callMethodAsync("/Execute2", "crazy"); callMethodAsync("/Execute3", "world"); }); }); </script> </asp:Content> <asp:Content ID="BodyContent" runat="server" ContentPlaceHolderID="MainContent"> <input type="button" id="callMany" value="Post Many" /> <div id="message"> </div> </asp:Content> Web.config (relevant) <system.webServer> <modules runAllManagedModulesForAllRequests="true" /> </system.webServer> <system.serviceModel> <serviceHostingEnvironment aspNetCompatibilityEnabled="true" multipleSiteBindingsEnabled="true" /> <standardEndpoints> <webHttpEndpoint> <standardEndpoint name="" 
helpEnabled="true" automaticFormatSelectionEnabled="true" /> </webHttpEndpoint> </standardEndpoints> </system.serviceModel> Global.asax void Application_Start(object sender, EventArgs e) { RouteTable.Routes.Ignore("{resource}.axd/{*pathInfo}"); RouteTable.Routes.Add(new ServiceRoute("", new WebServiceHostFactory(), typeof(MyService))); }


  • Where should I go with hosting my site: VPS, GAE, another option?

    - by Jonathan Hayward
    My website, http://JonathansCorner.com/, began life before 1994 as www.imsa.edu/~jhayward/ and has been through various iterations and improvements to content, HTML, and the like, but remains a literature site that is from a web administrator's perspective fairly simple and primitive: a fair amount of static HTML and supporting files, a little bit of CGI and URI rewriting, .htaccess files providing Expires: headers and the like. An associated site demoes various CGI scripts that fall under the category of "and other creations"; the site as a whole has the purpose of sharing my creative works, and so far a fairly rudimentary use of Apache functionality, supported by Unix tools to, for instance, update RSS feed and the "starting point" link on the home page, has served that purpose fairly well. I looked around here on web hosting, and found the note on web host reccommendations as a good note for "What are some of people's favorite web hosts overall," but I wanted to ask a more focused question of "What are the best web hosts for criteria XYZ:" I am looking at a VPS so I will have root, be able to install stuff and edit Apache's config files etc., running Gentoo or other Linux, BSD, or the like. I would like a system that is secure enough that the host's vulnerabilities are mostly the ones that come along with what I am trying to do: that is, I won't be trying to administer and secure an ancient Linux like some have complained about at 1and1. I would like good uptime/reliability and competent support staff: if the level 1 help desk is going to tell me to go to "My Computer" on a Linux box, I'd like to be able to get past them. Ideally I would like a site hosted within some place that will have low latency for U.S. visitors in particular. I would like a hosting solution that will be with a stable business, one that will probably be around, and one unlikely to vanish without warning. With those things specified, I would be interested in knowing what are the less expensive options. (I expect that some of the things I've specified will knock out all of the cheapest options, but I'm still interested in price.) With all that stated, I'd like to back up a bit and look at whether I am asking the right question. I am concerned that the above is a very good way of asking, "How can I keep my site in line with the wave of the past?" I am wondering if it might be specifically wiser to look to adapt my site to newer technologies instead of trying to keep it on older technologies. For instance, while I would hardly portray my site as a way to show off the full power of Google App Engine, the main site at least should be a straightforward port if I were to do that. And beyond Google App Engine, my knowledge of cloud solutions is basic. If it is a better and more future-proof solution to port my site to another kind of solution, I would be interested in knowing where those future-proof solutions lie. So I would be interested in wisdom. If the question I asked in detail is still a good question to be asking, what would people suggest? Or if I should seriously consider porting my site to a newer basic option, what should I try there? Any thoughts would be appreciated.

    Read the article

  • windows authentication vs sql server authentication for asp.net forms authentication site

    - by Brij
    I have a database and a site that uses forms authentication. It works fine when run from VS2008; at the moment I am using "Trusted_Connection=True". But when the site is opened from outside, or directly from a browser, I get the error "Login failed for user 'NT AUTHORITY\ANONYMOUS LOGON'." I know this is a permissions issue: SQL Server is set up for Windows authentication. What is the best approach to managing the user account the site uses to connect to SQL Server? Should I enable SQL Server authentication? Please let me know what to do so that it behaves like production and there won't be any problems during deployment. Note: SQL Server is installed on the domain server.
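
    One common way out, assuming SQL Server is switched to mixed mode, is to create a dedicated SQL login for the site and put it in the connection string, so the web application never connects as the anonymous Windows identity. The sketch below is only an illustration; the server name, database name, login and password are placeholder assumptions, not values from the question. Keeping Windows authentication by running the application pool under a domain account that has been granted access to the database is an equally valid alternative.

        using System;
        using System.Data.SqlClient;

        class SqlAuthConnectionSketch
        {
            static void Main()
            {
                // Placeholder values: DBSERVER, MyAppDb, webapp_user and the
                // password are assumptions for illustration only.
                const string connectionString =
                    "Data Source=DBSERVER;Initial Catalog=MyAppDb;" +
                    "User ID=webapp_user;Password=ChangeMe!123;";

                using (var connection = new SqlConnection(connectionString))
                using (var command = new SqlCommand("SELECT SUSER_SNAME()", connection))
                {
                    connection.Open();
                    // With SQL Server authentication this prints "webapp_user"
                    // instead of NT AUTHORITY\ANONYMOUS LOGON.
                    Console.WriteLine("Connected as: " + command.ExecuteScalar());
                }
            }
        }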

    Read the article

  • Cross-site json rpc : Python server side and Mozilla extension using Javascript client side

    - by jknair
    hello, I am building a mozilla extension that contacts a python application on a remote server to send and receive data. The python application can be used using xml-rpc from a python console.I am given the task to design a json-rpc that would contact the same application.Making the python server side has been easy which can be accesed using python console but making the mozilla extension to connect to the python serverside is what i am not understanding howto ??? how do i make cross site json rpc calls i have gone through a lot of libraries that i can find on googling but none of them seem to work i am not sure if it is because of same origin policy or my server side not able to process the data ??? ANY HELP

    Read the article

  • get site source code as registered user (C#)

    - by nir143
    Hi. I downloaded the source code of a site, but when I downloaded it I saw that the site identified my program as a guest. I searched on Google and figured out that I can send a cookie when I "ask for" the source code. This is what I have managed to do, and it still doesn't identify me as a registered user:

        CookieContainer cj = new CookieContainer();
        string all = "";
        HttpWebRequest req = (HttpWebRequest)WebRequest.Create(Url);
        req.CookieContainer = cj;
        HttpWebResponse res = (HttpWebResponse)req.GetResponse();
        CookieCollection cs = cj.GetCookies(req.RequestUri);
        CookieContainer cc = new CookieContainer();
        cc.Add(cs);
        req.CookieContainer = cc;
        StreamReader read = new StreamReader(res.GetResponseStream());
        all = read.ReadToEnd();
        read.Close();
        return all;

    What is wrong here? Thanks very much for the help :)
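
    For what it's worth, the snippet above only sends back cookies it received from an anonymous request, so the site never sees a logged-in session. A registered-user download usually has to POST the login form first and then reuse the same CookieContainer for the page request. The sketch below is only a guess at the general shape; the login URL and the form field names ("username", "password") are assumptions and have to match the site's actual login form.

        using System;
        using System.IO;
        using System.Net;
        using System.Text;

        class AuthenticatedDownload
        {
            static string GetPageAsRegisteredUser(string loginUrl, string pageUrl,
                                                  string user, string pass)
            {
                var cookies = new CookieContainer();

                // 1. POST the credentials; the auth cookie the server returns
                //    is captured in the shared CookieContainer.
                var loginRequest = (HttpWebRequest)WebRequest.Create(loginUrl);
                loginRequest.Method = "POST";
                loginRequest.ContentType = "application/x-www-form-urlencoded";
                loginRequest.CookieContainer = cookies;

                byte[] body = Encoding.UTF8.GetBytes(
                    "username=" + Uri.EscapeDataString(user) +
                    "&password=" + Uri.EscapeDataString(pass));
                using (Stream requestStream = loginRequest.GetRequestStream())
                {
                    requestStream.Write(body, 0, body.Length);
                }
                using (loginRequest.GetResponse()) { }  // discard the login response body

                // 2. Request the page with the same container, so the auth cookie is sent.
                var pageRequest = (HttpWebRequest)WebRequest.Create(pageUrl);
                pageRequest.CookieContainer = cookies;

                using (var response = (HttpWebResponse)pageRequest.GetResponse())
                using (var reader = new StreamReader(response.GetResponseStream()))
                {
                    return reader.ReadToEnd();
                }
            }
        }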

    Read the article

  • Silverlight getting opened site

    - by Rafal
    Hi. Let's say I have a MainPage that has a button. I also have another page (Page2) that has a textbox. I would like to put the text "TEXT" into the textbox in Page2 when navigating to it via the button on MainPage. My problem is getting hold of the currently opened page. In Windows Forms applications it is solved by:

        Page2 page2 = (Page2)Application.OpenForms["Page2"];
        page2.TextBox.Text = "TEXT";

    How can I do it in Silverlight? Help!
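
    In Silverlight the usual pattern is not to reach into another page's controls, but to pass the value along with the navigation and pick it up on the target page. A rough sketch using the Silverlight navigation framework is below; the frame name ContentFrame and the textbox name MyTextBox are assumptions based on the standard navigation template, not names from the question.

        using System;
        using System.Windows;
        using System.Windows.Navigation;

        // MainPage.xaml.cs — navigate and hand the value over as a query-string parameter.
        public partial class MainPage
        {
            private void Button_Click(object sender, RoutedEventArgs e)
            {
                ContentFrame.Navigate(new Uri("/Page2.xaml?text=TEXT", UriKind.Relative));
            }
        }

        // Page2.xaml.cs — read the parameter when the page is navigated to.
        public partial class Page2
        {
            protected override void OnNavigatedTo(NavigationEventArgs e)
            {
                base.OnNavigatedTo(e);

                string text;
                if (NavigationContext.QueryString.TryGetValue("text", out text))
                {
                    MyTextBox.Text = text;  // MyTextBox is the TextBox declared in Page2.xaml
                }
            }
        }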

    Read the article

  • Jabber integration into site and multiple domains

    - by Jamie
    Hi all, I'm creating a centralised web product: each customer gets assigned a personal domain, for example company1.example.com or company2.example.com, and can then use our service. I'm planning to integrate a Jabber service into the website. I have already found a decent Jabber client library which I can use for the site. I know that jabberd2 uses MySQL, which is perfect because I want to use the web interface to add users, delete users, view message logs, etc. However, my problem is when I have two companies or more. I would like a Jabber server which can host multiple domains, i.e. jabber.company1.example.com, jabber.company2.example.com. Do you have any experience with this? And do you know of a good Jabber server which will do this? Any help would be hugely appreciated!

    Read the article

  • Incorrect site usage report (Sharepoint)

    - by Jorge E. Barsoba
    Hi... After restarting the "Windows SharePoint Services Timer" service, the "Site usage report" shows only one day of activity (matching the day we restarted the service), while every other day shows zeros. I took a look in the "\LOGS\" directory and it is full of logs with visit information for the whole year! Why does the usage report ignore them? I think maybe the log file for the day that is shown has some kind of mark that the other ones don't... Thanks, Jorge.

    Read the article

  • Best Open Source Project Hosting Site

    - by Cristian
    I want to start an open source project, but the rise in hosting sites leaves me a little paralyzed with choice. I know a little about several: I never really liked SourceForge's UI but it still feels like the site I think of when I think "open source project hosting". Google Code Project Hosting looks clean and useful but doesn't seem as feature complete as SourceForge. I've heard good things about Launchpad but don't know much about it nor do I know Bazaar (though I'd be interested in learning it). I know almost nothing about GitHub and, like Bazaar, I don't know Git. Does anyone have any experience with these sites or some other cool code host? Any recommendations? Recommended Sites: BitBucket Codeplex Assembla DevjaVu Savannah

    Read the article

  • codeigniter site, swfobject not fully loading swf

    - by Joe
    I am having a strange problem. I have a view that is supposed to load a SWF. The SWF was compiled with Flex; the MXML preloader displays, but then it loads a blank screen. When I browse directly to the file it loads fully and works fine. Other possibly relevant information:

      - The SWF makes calls through GET requests to the database.
      - The site is built with CodeIgniter.
      - I'm using swfobject to load the SWF.

    You can see it in all its busted glory here: http://thetoad.flattoads.com:16080/~iopdev/CI/index.php?c=moodtotem&m=index I'm going bonkers over this!

    Read the article

  • ASP.NET 3.5 Web Site stopped importing System namespace by default

    - by Alexis
    I have a VB Web Site project that has recently (and mysteriously) stopped importing the "System" namespace by default. I'm having to either place a "Imports System" line at the top of each code behind, or preface everything with "System", which is fairly annoying, not to mention redundant. I can't for the life of me figure out how to get the System namespace back to being imported by default. I've already checked to see that WINDOWS\Microsoft.NET\Framework\v2.0.50727\CONFIG\web.config contains the <add namespace="System"/> line--it does. That was my best lead. I'm tearing my hair out. Does anyone have any suggestions?

    Read the article

  • Django Per-site caching using memcached

    - by Paul
    Hi. So I'm using per-site caching on a project and I've observed the following, which is kind of confusing. When I load a flat page in my browser, then change it through admin, then do a refresh (within the cache timeout), there is no change in the page--as expected. However, when I start a new session in a different browser and load the page (still within the timeout), the app is hit instead of the cache. Isn't the cache key generated from the URL? It seems that the session state is getting in there somewhere, which is causing a cache miss. Any ideas? Thanks.

        MIDDLEWARE_CLASSES = (
            'django.middleware.cache.UpdateCacheMiddleware',
            'django.middleware.common.CommonMiddleware',
            'django.contrib.sessions.middleware.SessionMiddleware',
            'django.contrib.auth.middleware.AuthenticationMiddleware',
            'django.middleware.gzip.GZipMiddleware',
            'django.middleware.http.ConditionalGetMiddleware',
            'django.middleware.doc.XViewMiddleware',
            'ittybitty.middleware.IttyBittyURLMiddleware',
            'django.contrib.flatpages.middleware.FlatpageFallbackMiddleware',
            'maintenancemode.middleware.MaintenanceModeMiddleware',
            'djangodblog.middleware.DBLogMiddleware',
            'SSL.middleware.SSLRedirect',  # SSL middleware to handle SSL
            'django.middleware.cache.FetchFromCacheMiddleware',
        )

    Read the article

  • Prevent bots from crawling certain areas of a site

    - by Skoder
    Hey, I don't know much about SEO and how web spiders work, so forgive my ignorance here. I'm creating a site (using ASP.NET-MVC) which has areas that display information retrieved from the database. The data is unique to the user, so there's no real server-side output caching going on. However, since the data can contain things the user may not wish to have displayed in search engine results, I'd like to prevent any spiders from accessing the search results page. Are there any special actions I should take to ensure that the search results directory isn't crawled? Also, would a spider even crawl a page that's dynamically generated, and would any actions preventing certain directories from being crawled mess up my search engine rankings? edit: I should add, I'm reading up on the robots.txt protocol, but it relies on co-operation from the web crawler. However, I'd also like to prevent any data-mining users who will ignore the robots.txt file. I appreciate any help!
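
    Since robots.txt only works for crawlers that choose to honour it, the usual belt-and-braces approach is to keep the personalised results behind authentication and additionally send a noindex hint for the well-behaved bots. The ASP.NET MVC sketch below is only an illustration: the controller name and the idea of a /SearchResults path are assumptions, not details from the question. A matching "Disallow: /SearchResults" line in robots.txt would cover the cooperative crawlers, and blocking a private, login-only area does not normally hurt rankings for the public pages.

        using System.Web.Mvc;

        // [Authorize] is the only reliable barrier against crawlers and
        // data-mining users that ignore robots.txt: without a logged-in
        // session they get redirected to the login page instead of the
        // private results.
        [Authorize]
        public class SearchResultsController : Controller
        {
            public ActionResult Index(string query)
            {
                // Hint for compliant bots: even if a link to this page leaks,
                // do not index or follow it.
                Response.AppendHeader("X-Robots-Tag", "noindex, nofollow");

                object model = null;  // load the user's private results here
                return View(model);
            }
        }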

    Read the article

< Previous Page | 64 65 66 67 68 69 70 71 72 73 74 75  | Next Page >