Search Results

Search found 21235 results on 850 pages for 'www'.


  • How to maintain a session with cURL in PHP?

    - by newbie programmer
    How can we maintain a session with cURL? I have code that sends login details to a site and logs in successfully; now I need the session to stay active on that site so I can continue. Here is the code I use to log in with cURL:

        <?php
        $socket = curl_init();
        curl_setopt($socket, CURLOPT_URL, "http://www.XXXXXXX.com");
        curl_setopt($socket, CURLOPT_REFERER, "http://www.XXXXXXX.com");
        curl_setopt($socket, CURLOPT_POST, true);
        curl_setopt($socket, CURLOPT_USERAGENT, $agent);
        curl_setopt($socket, CURLOPT_POSTFIELDS, "form_logusername=XXXXX&form_logpassword=XXXXX");
        curl_setopt($socket, CURLOPT_COOKIESESSION, true);
        curl_setopt($socket, CURLOPT_COOKIEJAR, "cookies.txt");
        curl_setopt($socket, CURLOPT_COOKIEFILE, "cookies.txt");
        $data = curl_exec($socket);
        curl_close($socket);
        ?>
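
    A possible follow-up, not from the original post: as long as every later request reuses the same cookie jar file, the session cookie saved at login is sent back automatically and the session is kept. A minimal sketch; the member-area URL is a placeholder:

        <?php
        // Assumes the login above has already run and written cookies.txt.
        $socket = curl_init();
        curl_setopt($socket, CURLOPT_URL, "http://www.XXXXXXX.com/member-area"); // hypothetical page behind the login
        curl_setopt($socket, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($socket, CURLOPT_COOKIEFILE, "cookies.txt"); // send the cookies saved at login
        curl_setopt($socket, CURLOPT_COOKIEJAR, "cookies.txt");  // keep them updated
        $page = curl_exec($socket);
        curl_close($socket);
        ?>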

    Read the article

  • sites integration

    - by bassem ala
    Has anyone had this task before? Suppose you have a website called www.a.com and a second one, www.b.com. a.com has a button; when you are logged in to a.com and click the button, you are sent to b.com. The question is: how can I do this while carrying the user's credentials over, so that he is automatically logged in to b.com as well? PS: every user who has an account on a.com definitely also has an account on b.com. Any ideas? Thank you for your ideas and support. :)

    Read the article

  • AWS ECS API - ItemLookup - Get item's (EAN) SalesRank by BrowseNode

    - by Lysender
    I'm using Amazon Japan with ItemLookup. We already have the EAN numbers (used as ItemId). It was working well until we needed to change how we use the SalesRank: the SalesRank returned so far is for the whole SearchIndex, e.g. Books, Music, etc. What we want is the SalesRank for a specific BrowseNode. I'm not sure, but it seems this is not implemented on Amazon US. For example: http://www.amazon.com/Slackware-Essentials-Cantrell-Johnson-Lumens/dp/1571763384/ref=sr_1_1?ie=UTF8&s=books&qid=1276142663&sr=1-1 - the book above only shows the overall SalesRank (Bestsellers Rank). But on Amazon Japan: http://www.amazon.co.jp/Slackware-Linux-Dummies/dp/0764506897/ref=sr_1_4?ie=UTF8&s=english-books&qid=1276142708&sr=1-4 - it shows the SalesRank for three other BrowseNodes as well. We need to get the SalesRank for a specific BrowseNode for a given ItemId, or by any other means. Any ideas, guys? TIA

    Read the article

  • jQuery iframe src attribute

    - by alex
    Why does x alert undefined for the iframe but work for the embed? I'm grabbing the iframe or embed code from a textarea:

        <iframe src="http://www.youtube.com/embed/9kiWvkj2ldWiU?hd=1"></iframe>

        var textarea = $('#embedModal textarea'),
            textareaValue = textarea.val(),
            $embed = $($(textareaValue).find('iframe')),
            x = $embed.attr('src');
        alert(x); // alerts undefined for iframe

    If you change find('iframe') to find('embed') and try it with the embed code below, I am able to get the value of src, but with the iframe I get undefined. Seems strange.

        <embed src="http://www.youtube.com/embed/9kiWvkj2ldWiU?hd=1"></embed>
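
    A hedged note, not part of the original question: when jQuery parses a standalone <iframe ...> string, the iframe is the top-level element of the resulting set, and .find() only searches descendants, so it comes back empty. A sketch using .filter() instead, which matches the top-level elements themselves:

        var textareaValue = $('#embedModal textarea').val(),
            $embed = $(textareaValue).filter('iframe, embed'), // top-level element, not a descendant
            x = $embed.attr('src');
        alert(x);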

    Read the article

  • Scheduler control for ASP.NET?

    - by user359706
    Is there a schedule control for ASP.NET? What I need: columns display users; rows display months and days. Clicking a cell opens a popup, in which we can: select a status in a DropDownList; if the status is "be close", show two calendars (start date and end date); then apply a color to the selected period. I know I will not find a control that matches my needs exactly, but I want the component that comes closest. Something like http://www.daypilot.org/ or http://www.codeproject.com/KB/webforms/EventCalendarControl.aspx. I hope this is clearly understandable; thank you, your help will be precious to me.

    Read the article

  • How can I set my root directory based on a cookie?

    - by Kacey
    I'm attempting to use my .htaccess file to set my root folder based on a cookie that is set through the website. Unfortunately, I haven't been able to find any information on how to do this. Here is what I have tried, which I know is incorrect, but it was the best I could find to work from:

        RewriteCond %{HTTP_HOST} ^(www\.)?domain\.(int|com)$ [NC,OR]
        RewriteCond %{REQUEST_URI} !^/s3i.flat/
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{HTTP_COOKIE} s=n2014 [NC]
        RewriteRule ^(.*)$ /s3i.flat/$1
        RewriteRule ^(.*)$ /s3i.2014/$1

        RewriteCond %{HTTP_HOST} ^(www\.)?domain\.(int|com)$ [NC]
        RewriteCond %{HTTP_COOKIE} s=n2014 [NC]
        RewriteRule ^(/)?$ s3i.flat/ [L]
        RewriteRule ^(/)?$ s3i.2014/ [L]

    What I'm trying to do is use the s3i.flat directory if the cookie named s is equal to n2014, and the other directory if the cookie is not set or does not equal that. Is anyone aware of a way this can be done? Thanks in advance.
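
    A possible direction, not from the original post: mod_rewrite only applies RewriteCond lines to the single RewriteRule that immediately follows them, so each rule needs its own cookie condition. A sketch along those lines, using the directory names from the question (adjust to taste):

        RewriteEngine On

        # Cookie s=n2014 present: serve everything from s3i.flat/
        RewriteCond %{HTTP_COOKIE} (^|;\s*)s=n2014(;|$) [NC]
        RewriteCond %{REQUEST_URI} !^/s3i\.flat/
        RewriteRule ^(.*)$ /s3i.flat/$1 [L]

        # Cookie missing or different: fall back to s3i.2014/
        RewriteCond %{HTTP_COOKIE} !(^|;\s*)s=n2014(;|$) [NC]
        RewriteCond %{REQUEST_URI} !^/s3i\.2014/
        RewriteRule ^(.*)$ /s3i.2014/$1 [L]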

    Read the article

  • Is there any way I can quickly draw a chart with many series? [on hold]

    - by Maryam Bagheri
    I have a project that needs to draw a chart with many series quickly. I must change the start position of the drawing to whatever position the user wants. The .NET Chart control is slow with lots of series. I wanted to use this library: http://www.codeproject.com/Articles/32836/A-simple-C-library-for-graph-plotting?msg=4109664#xx4109664xx but its loading speed becomes too slow as the number of series increases. When I plot the chart I must invalidate it to update it, and the drawing speed becomes too slow when there are too many series. I need at least 900 series in one chart for this project. I wanted to use multiple layered series to reduce the invalidation rate, following this idea: http://www.codeproject.com/Articles/84833/Drawing-multiple-layers-without-flicker but I do not know whether that idea would help. I tested the LightningChart control, but it could not help me.

    Read the article

  • How to persist a cookie?

    - by user246114
    Hi, I am creating a cookie in a JSP script located at www.myproject.com/login/index.jsp. If I restart the browser and navigate there, all works well and I can see the cookie persist. If I navigate to www.myproject.com, however, I do not see the cookie. Do I need to set something in the cookie path or domain to make the cookie visible to the entire myproject.com domain? (I just want to access the cookie from whatever sub-path the user may be on.) I am creating the cookie like this:

        Cookie c = new Cookie("thisisatest", "foo");
        c.setMaxAge(60 * 24 * 3600);
        response.addCookie(c);

    Thanks
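
    A hedged note, not from the original question: a servlet cookie defaults to the path of the page that created it (here /login), so it is only sent back for URLs under that path; setting the path to the site root should make it visible everywhere on the domain. A minimal sketch:

        Cookie c = new Cookie("thisisatest", "foo");
        c.setMaxAge(60 * 24 * 3600);
        c.setPath("/");                      // send the cookie for every path on the site
        // c.setDomain(".myproject.com");    // only needed if subdomains must see it too
        response.addCookie(c);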

    Read the article

  • Aliasing localhost in Tomcat

    - by Saurabh
    I have a web application "quicker" deployed on Tomcat 5.5. Usually I run this application with the URL localhost/quicker, and it loads index.jsp, which is the home page of the application. I would like the same application to be reachable through the URL www.local.dev.mydomain.com/quicker. Would that be possible with some configuration in server.xml or somewhere else? One way of doing this could be to modify the Windows hosts file as:

        # 127.0.0.1 localhost
        127.0.0.1 www.local.dev.mydomain.com

    But I want to do this the Tomcat way if possible.
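
    One possible direction, not from the original post: Tomcat's server.xml lets a Host answer to additional names through an Alias element (a hosts-file or DNS entry is still needed so the name resolves to the machine). A sketch against the default host of a standard Tomcat 5.5 layout:

        <Host name="localhost" appBase="webapps" unpackWARs="true" autoDeploy="true">
            <!-- requests for this host name are served by the same webapps, including /quicker -->
            <Alias>www.local.dev.mydomain.com</Alias>
        </Host>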

    Read the article

  • Sharing folders with VirtualBox, Win7 Host and Ubuntu 9.10 Guest [closed]

    - by user225890
    I have a development setup from this tutorial: http://www.sitepoint.com/blogs/2009/10/27/build-your-own-dev-server-with-virtualbox/ But what I can't figure out is how to share a folder on my virtualized Ubuntu machine with the Win7 host. I want to use a Windows text editor to edit code that lives on my Ubuntu server. I've tried using the Shared Folders setting and adding "/var/www", but it says that the path is not absolute. When I click on "Other", it only lets me browse folders on my Win7 host. Both the host and the guest are 64-bit OSes. Thanks in advance!

    Read the article

  • When is a program limited by the memory bandwidth?

    - by hanno
    I want to know whether a program I am using, which requires a lot of memory, is limited by the memory bandwidth. When do you expect this to happen? Has it ever happened to you in a real-life scenario? I found several articles discussing this issue, including http://www.cs.virginia.edu/~mccalpin/papers/bandwidth/node12.html, http://www.cs.virginia.edu/~mccalpin/papers/bandwidth/node13.html and http://ispass.org/ucas5/session2_3_ibm.pdf. The first link is a bit old, but suggests that you need to perform fewer than roughly 1-40 floating-point operations per floating-point variable in order to see this effect (correct me if I'm wrong). How can I measure the memory bandwidth that a given program is using, and how do I measure the (peak) bandwidth that my system can offer? I don't want to discuss any complicated cache issues here; I'm only interested in the communication between the CPU and the memory.
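
    Not part of the original question, but a common way to estimate the sustainable bandwidth of a machine is a STREAM-style kernel: sweep arrays much larger than the caches and divide the bytes moved by the elapsed time. A rough C sketch (array size and timing are deliberately simple; treat the result as an estimate only):

        /* compile e.g. with: gcc -O2 triad.c   (add -lrt on older glibc for clock_gettime) */
        #include <stdio.h>
        #include <stdlib.h>
        #include <time.h>

        #define N (16 * 1024 * 1024)   /* 16M doubles per array (~128 MB), far larger than any cache */

        int main(void) {
            double *a = malloc(N * sizeof *a), *b = malloc(N * sizeof *b), *c = malloc(N * sizeof *c);
            if (!a || !b || !c) return 1;
            for (size_t i = 0; i < N; i++) { a[i] = 1.0; b[i] = 2.0; c[i] = 0.0; }

            struct timespec t0, t1;
            clock_gettime(CLOCK_MONOTONIC, &t0);
            for (size_t i = 0; i < N; i++)          /* STREAM "triad": two loads and one store per element */
                c[i] = a[i] + 3.0 * b[i];
            clock_gettime(CLOCK_MONOTONIC, &t1);

            double secs  = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
            double bytes = 3.0 * N * sizeof(double);        /* bytes read plus bytes written */
            printf("checksum %f, approx. bandwidth %.2f GB/s\n", c[N / 2], bytes / secs / 1e9);
            free(a); free(b); free(c);
            return 0;
        }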

    Read the article

  • Download content of the page using ajax jquery

    - by niao
    Greetings, how can I download some page content using Ajax and jQuery? I am doing something like this (two versions inside one script):

        $("p").click(function() {
            $('#result').load('http://google.com');
            $.ajax({
                url='www.google.com',
                success: function(data) {
                    $("result").html(data);
                    alert('Load was performed.');
                    var url = 'www.wp.pl';
                    $('div#result').load(url);
                    //var content = $.load(url);
                    //alert(content);
                    //$("#result").html("test");
                }
            });
        });

    but it does not return any content.
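
    A hedged aside, not from the original post: two things stand out in the snippet; url='...' is not valid inside an object literal (it should be url: '...'), and $("result") is missing the #. Even with those fixed, browsers block plain Ajax requests to other domains (same-origin policy), so loading google.com or www.wp.pl this way will fail. A sketch of the corrected syntax against a same-origin URL (the path is a placeholder):

        $("p").click(function() {
            $.ajax({
                url: '/some/local/page.html',   // must be same-origin unless the server allows CORS
                success: function(data) {
                    $("#result").html(data);    // note the '#' in the selector
                    alert('Load was performed.');
                }
            });
        });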

    Read the article

  • Add two WebViews in a FrameLayout

    - by user1478916
    I want to add two WebViews to one layout. I use a FrameLayout:

        <?xml version="1.0" encoding="utf-8"?>
        <FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
            android:layout_width="fill_parent"
            android:layout_height="fill_parent"
            android:orientation="vertical" >
            <WebView
                android:id="@+id/webview1"
                android:layout_width="350dip"
                android:layout_height="350dip" />
            <WebView
                android:layout_height="250dip"
                android:layout_width="250dip"
                android:id="@+id/webview2" />
        </FrameLayout>

    And in the main Activity:

        web1 = (WebView) findViewById(R.id.webview1);
        web2 = (WebView) findViewById(R.id.webview2);
        web1.loadUrl("http://www.google.com");
        web2.loadUrl("http://www.youtube.com");
        web2.setOnClickListener(new OnClickListener() {
            public void onClick(View v) {
                // TODO Auto-generated method stub
                Animation anim = AnimationUtils.loadAnimation(FrameWebViewActivity.this, android.R.anim.slide_in_left);
                web2.setAnimation(anim);
            }
        });

    But when I run the project, it only displays the YouTube WebView full screen. I want to display both WebViews. What must I do?
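
    A hedged note, not from the original post: a FrameLayout stacks its children on top of one another (and ignores android:orientation), so the second WebView simply covers the first. One common alternative is a vertical LinearLayout that splits the screen between the two, sketched below:

        <?xml version="1.0" encoding="utf-8"?>
        <LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
            android:layout_width="fill_parent"
            android:layout_height="fill_parent"
            android:orientation="vertical" >
            <WebView
                android:id="@+id/webview1"
                android:layout_width="fill_parent"
                android:layout_height="0dip"
                android:layout_weight="1" />
            <WebView
                android:id="@+id/webview2"
                android:layout_width="fill_parent"
                android:layout_height="0dip"
                android:layout_weight="1" />
        </LinearLayout>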

    Read the article

  • Sending a file from memory (rather than disk) over HTTP using libcurl

    - by cinek1lol
    Hi! I would like to send pictures from a program written in C++. This works:

        WinExec("C:\\curl\\curl.exe -H Expect: -F \"fileupload=@C:\\curl\\ok.jpg\" -F \"xml=yes\" -# \"http://www.imageshack.us/index.php\" -o data.txt -A \"Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.1) Gecko/20061204 Firefox/2.0.0.1\" -e \"http://www.imageshack.us\"", NULL);

    It works, but I would like to send the picture from memory, out of a char variable (you know what I mean? First I load the picture into a variable and then I send the variable), because right now I have to give the path of the picture on disk. I want to write this in C++ using the curl library, not by calling the .exe. I have also found the following program (which I have modified a bit):

        #include <stdio.h>
        #include <string.h>
        #include <iostream>
        #include <curl/curl.h>
        #include <curl/types.h>
        #include <curl/easy.h>

        int main(int argc, char *argv[])
        {
            CURL *curl;
            CURLcode res;
            struct curl_httppost *formpost = NULL;
            struct curl_httppost *lastptr = NULL;
            struct curl_slist *headerlist = NULL;
            static const char buf[] = "Expect:";

            curl_global_init(CURL_GLOBAL_ALL);

            /* Fill in the file upload field */
            curl_formadd(&formpost, &lastptr,
                         CURLFORM_COPYNAME, "send",
                         CURLFORM_FILE, "nowy.jpg",
                         CURLFORM_END);
            curl_formadd(&formpost, &lastptr,
                         CURLFORM_COPYNAME, "nowy.jpg",
                         CURLFORM_COPYCONTENTS, "nowy.jpg",
                         CURLFORM_END);
            curl_formadd(&formpost, &lastptr,
                         CURLFORM_COPYNAME, "submit",
                         CURLFORM_COPYCONTENTS, "send",
                         CURLFORM_END);

            curl = curl_easy_init();
            headerlist = curl_slist_append(headerlist, buf);
            if (curl) {
                curl_easy_setopt(curl, CURLOPT_URL, "http://www.imageshack.us/index.php");
                if ((argc == 2) && (!strcmp(argv[1], "xml=yes")))
                    curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headerlist);
                curl_easy_setopt(curl, CURLOPT_HTTPPOST, formpost);
                res = curl_easy_perform(curl);
                curl_easy_cleanup(curl);
                curl_formfree(formpost);
                curl_slist_free_all(headerlist);
            }
            system("pause");
            return 0;
        }
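
    A hedged pointer, not from the original post: curl_formadd() can take a part's body from a buffer that is already in memory via CURLFORM_BUFFER / CURLFORM_BUFFERPTR / CURLFORM_BUFFERLENGTH instead of CURLFORM_FILE, which is what "sending from a variable" would look like with this API. A minimal sketch; the buffer contents and the field name are placeholders:

        // In real code this buffer would hold the JPEG bytes, loaded from wherever
        // the picture comes from; it must stay alive until curl_formfree().
        static const char imageData[] = "...jpeg bytes...";   // placeholder contents
        long imageSize = (long)(sizeof(imageData) - 1);

        curl_formadd(&formpost, &lastptr,
                     CURLFORM_COPYNAME, "fileupload",         // form field name (assumed)
                     CURLFORM_BUFFER, "ok.jpg",               // file name reported to the server
                     CURLFORM_BUFFERPTR, imageData,           // bytes come from memory, not from disk
                     CURLFORM_BUFFERLENGTH, imageSize,
                     CURLFORM_END);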

    Read the article

  • Removing the default <a href> action

    - by user1285198
    Is there any way to stop the page from loading the next page when someone clicks on an <a> tag? Instead, I want to get the href value, for example "www.google.com", and then do a jQuery .load(), which looks like this:

        $(".window").load(hrefvalue);

    I know I could just change all the href values to fire a JavaScript function, but that takes a bit of time, so I'm just looking for the easiest way. So I want to: stop the page from loading the next page on an <a href> click; get the href value, e.g. http://www.google.com; and then do a jQuery (.load() or $.ajax()) call for the href page.
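
    A small sketch of one common approach, not from the original post: bind a click handler to the links, call preventDefault() so the browser does not navigate, and feed the link's href to .load(). Cross-domain URLs such as google.com will still be blocked by the browser, so a same-origin href is assumed here:

        $("a").click(function(e) {
            e.preventDefault();                     // stop the normal navigation
            var hrefvalue = $(this).attr("href");
            $(".window").load(hrefvalue);           // fetch the page and inject it into .window
        });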

    Read the article

  • How to make PHP list all Linux users?

    - by Data-Base
    Hello, I want to build a PHP-based site that automates some commands on my Ubuntu server. The first thing I did was edit the sudoers file and add the www-data user so that I can execute commands from PHP with root privileges:

        # running the web apps with root power!!!
        www-data ALL=(ALL) NOPASSWD: ALL

    Then my PHP code was:

        <?php
        $command = "cat /etc/passwd | cut -d\":\" -f1";
        echo 'running the command: <b>'.$command."</b><br />";
        echo exec($command);
        ?>

    It returns only one user (the last user)! How can I make it return all users? Thank you.
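
    A hedged note, not from the original post: exec() only returns the last line of a command's output; the full output is available through exec()'s second argument (or via shell_exec()). A minimal sketch:

        <?php
        $command = "cut -d: -f1 /etc/passwd";
        exec($command, $output);                    // $output receives every line
        foreach ($output as $user) {
            echo htmlspecialchars($user) . "<br />";
        }
        ?>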

    Read the article

  • Case sensitivity with the users controller on certain hosting

    - by Leo
    We generally use two different hosting services. On one, everything works as expected, as it does on my local dev servers. On the other server, however, I am having this problem: I can't access the users controller like this: http://www.example.com/users/login. But I can like this: http://www.example.com/Users/login (note the capitalised 'Users'). If I move the application to a sub-folder, everything works fine (both upper- and lowercase). The hosting company have looked at it and can't see a problem at their end, and they assure me that users is not a reserved word. You might say this isn't a problem, just use the version that works; unfortunately it leads to problems downstream, where the Cake core starts generating URLs itself. Has anybody else seen this problem, or does anyone know the solution? [This only occurs on the users controller - all others work as expected]

    Read the article

  • Drupal 7 - change index.php as a default main page

    - by reycoy
    I'm trying to work on my new Drupal 7 website while keeping my old static website live on the same hosting. So I have: index.html (the static website, which I want to keep for my users) and index.php (the new Drupal installation). Is there a way to handle this kind of situation in Drupal? The problem, as you can probably see at this point, is that every time I go to a different section I am redirected to the .html homepage. For example, the admin section is www.mydomain.com/?q=admin instead of www.mydomain.com/index.php?q=admin. What's the easiest way to keep the static HTML site for my current users while I develop my Drupal 7 website in the background?
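
    One possible direction, not from the original post: if the host runs Apache, the order of DirectoryIndex decides which file is served for the bare domain, so listing index.html first keeps the static page as the public home while the Drupal site stays reachable through explicit index.php?q=... URLs. A sketch for .htaccess, assuming the host allows the directive:

        # Serve the old static home page by default; Drupal stays reachable at /index.php?q=...
        DirectoryIndex index.html index.php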

    Read the article

  • Gauge chart is not displaying anything

    - by Sandy
    I am trying to display the latest speed from a MySQL database on a gauge chart. I have tried many things, but the gauge is not displayed; can anyone please help me? My code is attached; the PHP part prints the correct value, but I don't know why the gauge is not displayed.

        <?php
        $host = "localhost";    // Host name
        $username = "root";     // Mysql username
        $password = "";         // Mysql password
        $db_name = "mysql";     // Database name
        $tbl_name = "gpsdb";    // Table name

        // Connect to server and select database.
        $con = mysql_connect("$host", "$username") or die("cannot connect");
        mysql_select_db("$db_name") or die("cannot select DB");

        $data = mysql_query("SELECT speed FROM gpsdb WHERE DeviceId=1234 ORDER BY TIME DESC LIMIT 1") or die(mysql_error());
        while ($nt = mysql_fetch_assoc($data)) {
            $speed = $nt['speed'];
            $jsonTable = json_encode($speed);
            echo $jsonTable;
        }
        ?>
        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
        <html xmlns="http://www.w3.org/1999/xhtml">
        <head>
            <meta http-equiv="content-type" content="text/html; charset=utf-8"/>
            <title>Google Visualization API Sample</title>
            <script type="text/javascript" src="//www.google.com/jsapi"></script>
            <script type="text/javascript">
                google.load('visualization', '1', {packages: ['gauge']});
            </script>
            <script type="text/javascript">
                function drawVisualization() {
                    // Create and populate the data table.
                    var data = new google.visualization.DataTable(<?=$speed?>);
                    // Create and draw the visualization.
                    new google.visualization.Gauge(document.getElementById('visualization')).draw(data);
                }
                google.setOnLoadCallback(drawVisualization);
            </script>
        </head>
        <body style="font-family: Arial;border: 0 none;">
            <div id="visualization" style="width: 600px; height: 300px;"></div>
        </body>
        </html>
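
    A hedged observation, not from the original post: new google.visualization.DataTable(...) expects a full table description, but here it receives a bare number, so the gauge has nothing to draw. Building the table from the value with arrayToDataTable is one way around that; the min/max options below are an assumption and should be adjusted to the expected speed range. A sketch of the JavaScript part only:

        function drawVisualization() {
            // One row: a label column plus a numeric value column.
            var data = google.visualization.arrayToDataTable([
                ['Label', 'Value'],
                ['Speed', <?= (float)$speed ?>]
            ]);
            var options = { min: 0, max: 200 };   // assumed range, adjust as needed
            new google.visualization.Gauge(document.getElementById('visualization'))
                .draw(data, options);
        }
        google.setOnLoadCallback(drawVisualization);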

    Read the article

  • Apache - Restrict to IP not working.

    - by Probocop
    Hi, I have a subdomain that I only want to be accessible internally; I'm trying to achieve this in Apache by editing the VirtualHost block for that domain. Can anybody see where I'm going wrong? Note that my internal IP addresses here are 192.168.10.xxx. My code is as follows:

        <VirtualHost *:80>
            ServerName test.epiphanydev2.co.uk
            DocumentRoot /var/www/test
            ErrorLog /var/log/apache2/error_test_co_uk.log
            LogLevel warn
            CustomLog /var/log/apache2/access_test_co_uk.log combined
            <Directory /var/www/test>
                Order allow,deny
                Allow from 192.168.10.0/24
                Allow from 127
            </Directory>
        </VirtualHost>

    Thanks

    Read the article

  • Deny access to a PHP file (Nginx)

    - by Desmond Hume
    I want Nginx to deny access to a specific PHP file, let's call it donotexposeme.php, but it doesn't seem to work; the PHP script runs as usual. Here is what I have in the config file:

        location / {
            root /var/www/public_html;
            index index.php;
        }

        location ~ \.php$ {
            fastcgi_pass 127.0.0.1:9000;
            fastcgi_index index.php;
            fastcgi_param SCRIPT_FILENAME /var/www/public_html$fastcgi_script_name;
            include fastcgi_params;
        }

        location /donotexposeme.php {
            deny all;
        }

    Of course, I do sudo service nginx reload (or restart) each time I edit the config.
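
    A hedged note, not from the original post: nginx picks a matching regex location (here ~ \.php$) ahead of an ordinary prefix location, so the deny block is never used. An exact-match location (=) is checked before the regex locations, so marking the blocked file that way is one way to make the deny win:

        # the exact match is chosen before the ~ \.php$ regex location for this URI
        location = /donotexposeme.php {
            deny all;   # respond with 403 instead of passing the request to PHP-FPM
        }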

    Read the article

  • jQuery: Replace strings with .each()

    - by Warrantica
    I want a function that replaces each li with an image. This is my code:

        $(document).ready(function(){
            var tmphref;
            var tmpname;
            var str = '<a href="' + tmphref + '"><img src="http://www.somesite.com/a/' + tmpname[1] + '/avatar-small.jpg /></a>';
            $('#somediv li a').each(function(){
                tmphref = $(this).attr("href");
                tmpname = /http\:\/\/(\w+)\.somesite\.com\//.exec(tmphref);
                $(this).parent().replaceWith(str);
            });
        });

    The image lives at this specific path: www.somesite.com/a/username/avatar-small.jpg. The code above doesn't work. Any ideas? Thank you in advance.
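
    A hedged sketch, not from the original post: str is built once, before tmphref and tmpname have any values (indexing tmpname[1] while it is still undefined throws), so the loop never gets a usable string; building it inside .each(), after the regex has run, is the usual fix. The missing closing quote after avatar-small.jpg is also corrected here:

        $(document).ready(function () {
            $('#somediv li a').each(function () {
                var tmphref = $(this).attr("href");
                var tmpname = /http:\/\/(\w+)\.somesite\.com\//.exec(tmphref);
                if (!tmpname) { return; }   // skip links that don't match the expected host
                var str = '<a href="' + tmphref + '"><img src="http://www.somesite.com/a/' +
                          tmpname[1] + '/avatar-small.jpg" /></a>';
                $(this).parent().replaceWith(str);
            });
        });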

    Read the article

  • Can anyone help me with this JavaScript for validating a URL in ASPX?

    - by orrep
    Here's my code:

        function Validate_URL(url) {
            var iurl = url.value;
            var v = new RegExp();
            v.compile("/^(((ht|f){1}(tp:[/][/]){1})|((www.){1}))[-a-zA-Z0-9@:%_+.~#?&//=]+$/;");
            if (!v.test(iurl.value)) {
                url.style.backgroundColor = 'yellow';
            }
            return true;
        }

    No matter what I put in url, say http://www.abc.com/newpage.html, it returns false. How come?
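
    A hedged aside, not from the original question: the string handed to compile() keeps the surrounding /.../ and the trailing semicolon as literal characters of the pattern, and since iurl is already a string, iurl.value is undefined, so the test can never pass. A sketch using a regex literal and testing iurl directly (the pattern itself is only an illustrative, fairly loose URL check):

        function Validate_URL(url) {
            var iurl = url.value;   // 'url' is assumed to be the <input> element
            var v = /^((ht|f)tp:\/\/|www\.)[-a-zA-Z0-9@:%_\+.~#?&\/=]+$/;
            if (!v.test(iurl)) {
                url.style.backgroundColor = 'yellow';
                return false;
            }
            return true;
        }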

    Read the article

  • Extracting data from website URLs

    - by user2522395
    Using the script below I am able to extract all the links of a particular website, but I need to know how I can extract data from those links, especially e-mail addresses and phone numbers if they are there. Please help me modify the existing script to get this result, or, if you have a full sample script, please share it.

        Private Sub btnGo_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles btnGo.Click
            'url must be in this format: http://www.example.com/
            Dim aList As ArrayList = Spider("http://www.qatarliving.com", 1)
            For Each url As String In aList
                lstUrls.Items.Add(url)
            Next
        End Sub

        Private Function Spider(ByVal url As String, ByVal depth As Integer) As ArrayList
            'aReturn is used to hold the list of urls
            Dim aReturn As New ArrayList
            'aStart is used to hold the new urls to be checked
            Dim aStart As ArrayList = GrabUrls(url)
            'temp array to hold data being passed to new arrays
            Dim aTemp As ArrayList
            'aNew is used to hold new urls before being passed to aStart
            Dim aNew As New ArrayList
            'add the first batch of urls
            aReturn.AddRange(aStart)
            'if depth is 0 then only return 1 page
            If depth < 1 Then Return aReturn
            'loops through the levels of urls
            For i = 1 To depth
                'grabs the urls from each url in aStart
                For Each tUrl As String In aStart
                    'grabs the urls and returns non-duplicates
                    aTemp = GrabUrls(tUrl, aReturn, aNew)
                    'add the urls to be checked to aNew
                    aNew.AddRange(aTemp)
                Next
                'swap urls to aStart to be checked
                aStart = aNew
                'add the urls to the main list
                aReturn.AddRange(aNew)
                'clear the temp array
                aNew = New ArrayList
            Next
            Return aReturn
        End Function

        Private Overloads Function GrabUrls(ByVal url As String) As ArrayList
            'will hold the urls to be returned
            Dim aReturn As New ArrayList
            Try
                'regex string used: thanks google
                Dim strRegex As String = "<a.*?href=""(.*?)"".*?>(.*?)</a>"
                'i used a webclient to get the source
                'web requests might be faster
                Dim wc As New WebClient
                'put the source into a string
                Dim strSource As String = wc.DownloadString(url)
                Dim HrefRegex As New Regex(strRegex, RegexOptions.IgnoreCase Or RegexOptions.Compiled)
                'parse the urls from the source
                Dim HrefMatch As Match = HrefRegex.Match(strSource)
                'used later to get the base domain without subdirectories or pages
                Dim BaseUrl As New Uri(url)
                'while there are urls
                While HrefMatch.Success = True
                    'loop through the matches
                    Dim sUrl As String = HrefMatch.Groups(1).Value
                    'if it's a page or sub directory with no base url (domain)
                    If Not sUrl.Contains("http://") AndAlso Not sUrl.Contains("www") Then
                        'add the domain plus the page
                        Dim tURi As New Uri(BaseUrl, sUrl)
                        sUrl = tURi.ToString
                    End If
                    'if it's not already in the list then add it
                    If Not aReturn.Contains(sUrl) Then aReturn.Add(sUrl)
                    'go to the next url
                    HrefMatch = HrefMatch.NextMatch
                End While
            Catch ex As Exception
                'catch ex here. I left it blank while debugging
            End Try
            Return aReturn
        End Function

        Private Overloads Function GrabUrls(ByVal url As String, ByRef aReturn As ArrayList, ByRef aNew As ArrayList) As ArrayList
            'overloads function to check duplicates in aNew and aReturn
            'temp url arraylist
            Dim tUrls As ArrayList = GrabUrls(url)
            'used to return the list
            Dim tReturn As New ArrayList
            'check each item to see if it exists, so as not to grab the urls again
            For Each item As String In tUrls
                If Not aReturn.Contains(item) AndAlso Not aNew.Contains(item) Then
                    tReturn.Add(item)
                End If
            Next
            Return tReturn
        End Function
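
    A hedged sketch of the missing piece, not from the original script: once a page's HTML has been downloaded, e-mail addresses can be pulled out with another Regex pass (phone numbers are far less uniform, so that pattern would depend on the expected format). Assuming the same WebClient approach as above:

        ' Downloads one page and returns the distinct e-mail addresses found in it.
        Private Function GrabEmails(ByVal url As String) As ArrayList
            Dim aEmails As New ArrayList
            Try
                Dim wc As New WebClient
                Dim strSource As String = wc.DownloadString(url)
                'simple e-mail pattern; tighten it if stricter matching is needed
                Dim EmailRegex As New Regex("[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}", RegexOptions.IgnoreCase)
                For Each m As Match In EmailRegex.Matches(strSource)
                    If Not aEmails.Contains(m.Value) Then aEmails.Add(m.Value)
                Next
            Catch ex As Exception
                'ignore pages that fail to download
            End Try
            Return aEmails
        End Function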

    Read the article
