Search Results

Search found 20795 results on 832 pages for 'personal website'.

Page 7/832 | < Previous Page | 3 4 5 6 7 8 9 10 11 12 13 14  | Next Page >

  • Micro Focus launches Enterprise Developer Personal Edition, a free tool for developing IBM mainframe applications

    Micro Focus launches Enterprise Developer Personal Edition, a free tool for developing IBM mainframe applications. Micro Focus, the vendor of solutions for managing, testing, and modernizing enterprise applications, has just released its free IDE, Enterprise Developer Personal Edition, aimed at professional developers and computer science students working on IBM mainframe applications. It is in fact the easy-to-use, entry-level version of the full Micro Focus Enterprise Developer solution. It integrates with Eclipse or Visual Studio to turn them into development tools for distributed mainframe applications. This edition...

    Read the article

  • C# .NET utility suggestions for a personal computer or laptop

    - by alliswell
    Hi all, I want to know what utilities you have created for your personal computer or laptop for day-to-day use. For example, some of you may have created a task manager, a Windows service for scheduling, or a tool to get the latest feeds from SO. I need your experiences about what made your day-to-day tasks easier. And I don't want to know about any third-party (except commercial) tools. I will not commercialize this ;-), but I want to know how I can use my skills to create applications for personal use.
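
    A minimal sketch of one utility the question mentions, fetching the latest feed entries from SO; Python and its standard library are used here purely for illustration even though the question is about C#/.NET. The URL is Stack Overflow's public Atom feed.

        # Sketch: print the titles of the newest questions from the public
        # Stack Overflow Atom feed (one of the "personal utility" ideas above).
        import urllib.request
        import xml.etree.ElementTree as ET

        ATOM = "{http://www.w3.org/2005/Atom}"  # Atom XML namespace

        with urllib.request.urlopen("https://stackoverflow.com/feeds", timeout=10) as resp:
            tree = ET.parse(resp)

        for entry in tree.getroot().findall(f"{ATOM}entry")[:10]:
            print(entry.find(f"{ATOM}title").text)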

    Read the article

  • IIS 7 Cannot Access Website

    - by UberError
    We can access our website from other systems, but when logged into the local machine where the site is hosted, it does not resolve to the site. For example: http://mysite.com/folder/page.aspx returns a 404. From the local machine we also cannot ping mysite.com. What are some common things to check to troubleshoot this issue? I'm new to IIS 7, so sorry for the vague question.
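
    Since the name fails to ping from the server itself, a hedged first check is whether the hostname resolves locally at all; the minimal Python sketch below makes that check explicit (mysite.com is the question's placeholder name). If it fails, a hosts-file entry on that machine pointing the name at 127.0.0.1 is a common remedy.

        # Sketch: does the hostname resolve on this machine at all?
        import socket

        hostname = "mysite.com"  # placeholder name from the question
        try:
            print(hostname, "resolves to", socket.gethostbyname(hostname))
        except socket.gaierror as err:
            # No DNS/hosts entry for the name on this machine.
            print(hostname, "does not resolve locally:", err)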

    Read the article

  • Can't remember the website for downloading common tools after reinstall

    - by JB
    I remember a while back finding a website for downloading lots of common tools in one go after a system reinstall. It had a dark background and checkboxes for selecting the tools (browsers, editors, readers, IM clients, etc.). After selecting the tools, it downloaded a small file which then went off, downloaded everything you'd selected, and performed an install of each of the different apps. Does anyone have any idea what I'm talking about?

    Read the article

  • Best Photo Sharing Website for Teams/Organizations [closed]

    - by Patrick Cuff
    I manage the website for my daughter's U-10 soccer team. I'd like to set up a place where team members can both view and upload pictures. It needs to be secure, so that only authorized/registered members can view and upload pics. I've looked into Picasa and Flickr, and while both are good for sharing photos, only one registered user can upload. Are there any good sites that allow multiple people to upload photos?

    Read the article

  • How to make a maps website?

    - by Eias.N
    I want to build a website that provides maps with information about a city (like Google Maps). How can I do that? Is it good to use an app to do that? Can you give me a few examples? Or is it best to do that with web programming? Is there any tutorial that can help me out?
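
    A hedged illustration of the web-programming route: rather than building map rendering from scratch, sites usually lean on an existing mapping library. The sketch below uses the Python library folium (a wrapper around the Leaflet JavaScript library); the coordinates and marker text are made-up placeholders.

        # Sketch: generate a self-contained HTML page with an interactive
        # city map using folium (pip install folium). The coordinates and
        # marker label are hypothetical placeholders.
        import folium

        city_map = folium.Map(location=[48.8566, 2.3522], zoom_start=12)
        folium.Marker(
            [48.8584, 2.2945],
            popup="Example point of interest",
        ).add_to(city_map)
        city_map.save("city_map.html")  # serve this file from any web server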

    Read the article

  • How to send credentials to the LinkedIn website and get oauth_verifier without signing in again [closed]

    - by akash kumar
    I am facing a problem sending credentials to another website so that I can log the user in (automatically, not by clicking sign in there) and get an oauth_verifier value. I want to send the email address and the password through a form (submit button) from my website (e.g. a Liferay portal) to another website (e.g. LinkedIn), so that it automatically returns an oauth_verifier to my website. That means I don't want the user of my website to submit his email and password to LinkedIn again. My goal is to take the email and password of the user on my website and show the user his LinkedIn connections, messages, and job postings (again, in my website, not LinkedIn). I don't want the user redirected to the LinkedIn website to sign in there and then come back to my website. I have taken a consumer key and a secret key from LinkedIn for my web application. I am using the LinkedIn API and getting an oauth_verifier for the access token, but in order to log in, I have to take the user to LinkedIn to sign in, while I want it to happen in the backend.
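
    For context, a minimal sketch of the standard three-legged OAuth 1.0a flow the LinkedIn API used, written in Python with the requests_oauthlib library; the keys and callback are placeholders, and the URLs are LinkedIn's historical OAuth 1.0a endpoints (check the current documentation). Note that the oauth_verifier only comes into existence after the user approves the application on LinkedIn's own page, which is exactly why the protocol resists the fully backend approach described above.

        # Sketch of the three-legged OAuth 1.0a dance. The protocol is designed
        # so the consumer site never handles the user's LinkedIn password;
        # the verifier is issued only after the user authorizes at step 2.
        from requests_oauthlib import OAuth1Session

        linkedin = OAuth1Session(
            "your-consumer-key",                                # placeholder
            client_secret="your-consumer-secret",               # placeholder
            callback_uri="https://example.com/oauth/callback",  # placeholder
        )

        # Step 1: obtain a request token.
        linkedin.fetch_request_token("https://api.linkedin.com/uas/oauth/requestToken")

        # Step 2: the user must visit this URL and authorize the application;
        # LinkedIn then redirects to the callback with an oauth_verifier.
        print(linkedin.authorization_url("https://api.linkedin.com/uas/oauth/authenticate"))

        # Step 3: exchange the verifier captured at the callback for an access token.
        # linkedin.fetch_access_token(
        #     "https://api.linkedin.com/uas/oauth/accessToken", verifier="..."
        # )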

    Read the article

  • Virtualhost entries get overwritten when the Apache httpd.conf is rebuilt

    - by Amitabh
    Background: We have been trying to get a wildcard SSL certificate working on multiple subdomains on a single dedicated address. We have two subdomains, next.my-personal-website.com and blog.my-personal-website.com. Part of our strategy has been to edit httpd.conf and add the NameVirtualHost xx.xx.144.72:443 directive and the virtualhost entries for port 443 for the subdomains there. This works well if we just edit httpd.conf, add the entries, save it, and restart Apache.

    The problem: If we add a new subdomain from cPanel, or we run

        # /usr/local/cpanel/bin/apache_conf_distiller --update
        # /scripts/rebuildhttpdconf

    then the virtualhost entries that we added manually are no longer in the newly generated httpd.conf file. Only the virtualhost entry for the main domain for port 443, which was there before we made our edits, remains (leaving the virtualhost entries for port 80 aside). I understand we need to put the new virtualhost entries in some include files, as mentioned in the cPanel documentation, but I am not sure where. So the question is: where do I put the NameVirtualHost xx.xx.144.72:443 directive and the two virtualhost directives for port 443, so that they are not overwritten when httpd.conf is rebuilt/regenerated later?

    Virtualhost entries: The two virtualhost entries for the subdomains are:

        <VirtualHost xx.xx.144.72:443>
            ServerName next.my-personal-website.com
            ServerAlias www.next.my-personal-website.com
            DocumentRoot /home/myguardi/public_html/next.my-personal-website.com
            ServerAdmin [email protected]
            UseCanonicalName On
            CustomLog /usr/local/apache/domlogs/next.my-personal-website.com combined
            CustomLog /usr/local/apache/domlogs/next.my-personal-website.com-bytes_log "%{%s}t %I .\n%{%s}t %O ."
            ## User myguardi # Needed for Cpanel::ApacheConf
            <IfModule mod_suphp.c>
                suPHP_UserGroup myguardi myguardi
            </IfModule>
            <IfModule !mod_disable_suexec.c>
                SuexecUserGroup myguardi myguardi
            </IfModule>
            ScriptAlias /cgi-bin/ /home/myguardi/public_html/next.my-personal-website.com/cgi-bin/
            SSLEngine on
            SSLCertificateFile /etc/ssl/certs/my-personal-website.com.crt
            SSLCertificateKeyFile /etc/ssl/private/my-personal-website.com.key
            SSLCACertificateFile /etc/ssl/certs/my-personal-website.com.cabundle
            CustomLog /usr/local/apache/domlogs/next.my-personal-website.com-ssl_log combined
            SetEnvIf User-Agent ".*MSIE.*" nokeepalive ssl-unclean-shutdown
            <Directory "/home/myguardi/public_html/cgi-bin">
                SSLOptions +StdEnvVars
            </Directory>
        </VirtualHost>

    and

        <VirtualHost xx.xx.144.72:443>
            ServerName blog.my-personal-website.com
            ServerAlias www.blog.my-personal-website.com
            DocumentRoot /home/myguardi/public_html/blog.my-personal-website.com
            ServerAdmin [email protected]
            UseCanonicalName On
            CustomLog /usr/local/apache/domlogs/blog.my-personal-website.com combined
            CustomLog /usr/local/apache/domlogs/blog.my-personal-website.com-bytes_log "%{%s}t %I .\n%{%s}t %O ."
            ## User myguardi # Needed for Cpanel::ApacheConf
            <IfModule mod_suphp.c>
                suPHP_UserGroup myguardi myguardi
            </IfModule>
            <IfModule !mod_disable_suexec.c>
                SuexecUserGroup myguardi myguardi
            </IfModule>
            ScriptAlias /cgi-bin/ /home/myguardi/public_html/blog.my-personal-website.com/cgi-bin/
            SSLEngine on
            SSLCertificateFile /etc/ssl/certs/my-personal-website.com.crt
            SSLCertificateKeyFile /etc/ssl/private/my-personal-website.com.key
            SSLCACertificateFile /etc/ssl/certs/my-personal-website.com.cabundle
            CustomLog /usr/local/apache/domlogs/blog.my-personal-website.com-ssl_log combined
            SetEnvIf User-Agent ".*MSIE.*" nokeepalive ssl-unclean-shutdown
            <Directory "/home/myguardi/public_html/cgi-bin">
                SSLOptions +StdEnvVars
            </Directory>
        </VirtualHost>

    and the automatically generated virtualhost entry for the main domain for port 443 is

        <VirtualHost xx.xx.144.72:443>
            ServerName my-personal-website.com
            ServerAlias www.my-personal-website.com
            DocumentRoot /home/myguardi/public_html
            ServerAdmin [email protected]
            UseCanonicalName Off
            CustomLog /usr/local/apache/domlogs/my-personal-website.com combined
            CustomLog /usr/local/apache/domlogs/my-personal-website.com-bytes_log "%{%s}t %I .\n%{%s}t %O ."
            ## User myguardi # Needed for Cpanel::ApacheConf
            <IfModule mod_suphp.c>
                suPHP_UserGroup myguardi myguardi
            </IfModule>
            <IfModule !mod_disable_suexec.c>
                SuexecUserGroup myguardi myguardi
            </IfModule>
            ScriptAlias /cgi-bin/ /home/myguardi/public_html/cgi-bin/
            SSLEngine on
            SSLCertificateFile /etc/ssl/certs/my-personal-website.com.crt
            SSLCertificateKeyFile /etc/ssl/private/my-personal-website.com.key
            SSLCACertificateFile /etc/ssl/certs/my-personal-website.com.cabundle
            CustomLog /usr/local/apache/domlogs/my-personal-website.com-ssl_log combined
            SetEnvIf User-Agent ".*MSIE.*" nokeepalive ssl-unclean-shutdown
            <Directory "/home/myguardi/public_html/cgi-bin">
                SSLOptions +StdEnvVars
            </Directory>
            # To customize this VirtualHost use an include file at the following location
            # Include "/usr/local/apache/conf/userdata/ssl/2/myguardi/my-personal-website.com/*.conf"
        </VirtualHost>

    I would really appreciate it if somebody could tell me how to proceed on this. Thank you.

    Update: The Include directives present are:

        Include "/usr/local/apache/conf/includes/pre_main_global.conf"
        Include "/usr/local/apache/conf/includes/pre_main_2.conf"
        Include "/usr/local/apache/conf/php.conf"
        Include "/usr/local/apache/conf/includes/errordocument.conf"
        Include "/usr/local/apache/conf/modsec2.conf"
        Include "/usr/local/apache/conf/includes/pre_virtualhost_global.conf"
        Include "/usr/local/apache/conf/includes/pre_virtualhost_2.conf"

    These are the entries that are generated before any virtualhost entry is defined. Towards the end of the httpd.conf file, the following two entries are added:

        Include "/usr/local/apache/conf/includes/post_virtualhost_global.conf"
        Include "/usr/local/apache/conf/includes/post_virtualhost_2.conf"

    The older httpd.conf file, from before we added the virtualhost entries for the subdomains for port 443, can be viewed here.

    Read the article

  • Ruby on Rails website hosting

    - by sfactor
    I want to start a website; it'll be a small community-based website. I've learned a fair bit of Ruby on Rails and am planning to use it. However, I have never deployed a production website before; I've only practiced on my local computer. I wanted to know what I need in order to deploy the website on the internet. What is the best place to get a domain name and web hosting, especially for Ruby on Rails sites? How are cloud-based services like Amazon EC2 different from a traditional web host, and which is the better choice? What else might I need to do to deploy a website? Also, I may end up with a fair number of users in the future, so how do I go about planning for scalability issues? How do sites like Twitter, fmylife.com, etc. go about these things?

    Read the article

  • Charging for the creation of a website [closed]

    - by mattgcon
    I am not sure if this can be asked here, but you all are very reliable and trustworthy in my eyes, so straight to the point: I am a professional programmer and web designer for UCLA (I do not set prices), and I have an old colleague who needs a website designed for his new company. He wants his website to resemble the LDH Energy website, in layout, colors, and the Flash movies (no image replication or verbiage; all of that will be his). He has one idea of a price and I have another. Can anyone go and look at that website and let me know their opinion of what to charge for his website? Thank you, Matthew

    Read the article

  • Using Website Information Without WebView

    - by Mr. Monkey
    I am very new to this, and I am mostly looking for what I need to study to be able to accomplish the following. I want to use the GUI I have built for my app, but pull the information from a website. Say I have a website that looks like this (sorry, can't post pics yet): http://dl.dropbox.com/u/7037695/ErrorCodeApp/FromWebsite.PNG (the full website can be seen at http://www.atmequipment.com/Error-Codes). What would I need from the website so that, if a user entered an error code here: http://dl.dropbox.com/u/7037695/ErrorCodeApp/InApp.PNG it would use the website's search and populate the error description in my app? I know this is a huge question; I'm just looking for what is actually needed to accomplish this, and then I can start researching from there. Or is it even possible?
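
    A hedged sketch of the general technique the question is reaching for: instead of showing the site in a WebView, issue the site's search request directly over HTTP and parse the returned HTML. Python with requests and BeautifulSoup is used for brevity (on Android the same idea would be an HTTP client plus an HTML parser in Java); the query parameter name and the CSS class below are hypothetical and would need to be read off the real page.

        # Sketch: query the site's search and scrape the error description.
        # The parameter name "search" and the class "error-description" are
        # hypothetical placeholders; inspect the real page to find the
        # actual form fields and markup.
        import requests
        from bs4 import BeautifulSoup

        def lookup_error_code(code):
            resp = requests.get(
                "http://www.atmequipment.com/Error-Codes",
                params={"search": code},  # hypothetical parameter name
                timeout=10,
            )
            resp.raise_for_status()
            soup = BeautifulSoup(resp.text, "html.parser")
            hit = soup.find(class_="error-description")  # hypothetical markup
            return hit.get_text(strip=True) if hit else None

        print(lookup_error_code("D0111"))  # made-up example code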

    Read the article

  • How secure is my website?

    - by Doug
    As a beginning web developer, I try my best to clean up all user inputs through checks and whatnot. However, today I found out my website was hacked (I'll share their website on request), and it really made me wonder how they did it. I'm in the process of getting my website back together. What should I do to prevent these things? Are there people I should talk to and ask how secure my website is? What can I do to keep my website safe?
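
    One concrete, widely agreed-on starting point (a hedged sketch, not a full security checklist): never assemble SQL from user input with string concatenation; pass the input as a bound parameter instead. Python's built-in sqlite3 is used for illustration, with made-up table and column names.

        # Sketch: parameterized queries keep user input from running as SQL.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE users (name TEXT, email TEXT)")

        user_input = "alice'; DROP TABLE users; --"  # hostile input

        # Unsafe (left commented out): string interpolation lets the input
        # rewrite the query.
        # conn.execute(f"SELECT * FROM users WHERE name = '{user_input}'")

        # Safe: the ? placeholder passes the value as data, never as SQL.
        rows = conn.execute(
            "SELECT * FROM users WHERE name = ?", (user_input,)
        ).fetchall()
        print(rows)  # [] -- the hostile string matched nothing and ran nothing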

    Read the article

  • Add an existing ASP.NET Website to Subversion using AnkhSVN/Tortoise

    - by EasyDot
    How do you add an existing ASP.NET website to Subversion, dealing with the problem that Subversion doesn't directly mirror the split folder layout in the repository? A default ASP.NET Website solution is spread across two folder trees:

        C:\Documents and Settings\UserName\My Documents\Visual Studio 2008\Projects\WebSite1\
            WebSite1.sln
            WebSite1.suo

        C:\Documents and Settings\UserName\My Documents\Visual Studio 2008\WebSites\WebSite1\
            App_Data
            Default.aspx
            web.config

    How do I import the website into the repository? How do I get working copies of the website from the repository? How do I branch the website? How do I merge the website branch into the trunk?
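
    A hedged outline of the usual Subversion operations behind those four questions, driven through Python's subprocess purely for illustration (the same commands work directly in a shell, and AnkhSVN/TortoiseSVN expose the same operations in their GUIs). The repository URL and paths are placeholders.

        # Sketch: import, check out, branch, and merge with the svn client.
        import subprocess

        REPO = "http://svnserver/repo"  # placeholder repository URL

        def svn(*args):
            subprocess.run(["svn", *args], check=True)

        # Import the website folder (the solution folder can be imported
        # alongside it under the same trunk directory):
        svn("import", "WebSite1", f"{REPO}/trunk/WebSite1", "-m", "Initial import")

        # Get a working copy:
        svn("checkout", f"{REPO}/trunk/WebSite1", "WebSite1")

        # Branch: a cheap server-side copy.
        svn("copy", f"{REPO}/trunk", f"{REPO}/branches/website1-branch",
            "-m", "Create branch")

        # Merge the branch back (run inside a trunk working copy, then commit):
        svn("merge", f"{REPO}/branches/website1-branch")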

    Read the article

  • Recommended website performance monitoring services? [closed]

    - by Dennis G.
    I'm looking for a good performance monitoring service for websites. I know about some of the available general monitoring services that check for uptime and notify you about unavailable services. But I'm specifically looking for a service with an emphasis on performance. I.e., I would like to see reports with detailed performance statistics from multiple locations world-wide, with a break-down on how long it took to fetch the different website resources, including third-party scripts such as Google Analytics and so on (the report should contain similar details such as the FireBug Net tab). Are there any such services and if so, which one is the best?

    Read the article

  • Website crawler/spider to get site map

    - by ack__
    I need to retrieve a whole website map, in a format like:

        http://example.org/
        http://example.org/product/
        http://example.org/service/
        http://example.org/about/
        http://example.org/product/viewproduct/

    I need it to be link-based (no file or directory brute-force): parse the homepage, retrieve all links, explore them, retrieve their links, and so on. I also need the ability to detect whether a page is a "template", so as not to retrieve all of its "child pages". For example, if the following links are found:

        http://example.org/product/viewproduct?id=1
        http://example.org/product/viewproduct?id=2
        http://example.org/product/viewproduct?id=3

    I need to get http://example.org/product/viewproduct only once. I've looked into HTTrack and wget (with the spider option), but nothing conclusive so far. The software/tool should be downloadable, and I'd prefer it to run on Linux. It can be written in any language. Thanks
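
    A hedged sketch of exactly the crawl described: start at the homepage, follow only same-site links, and collapse "template" pages by stripping query strings, so the three viewproduct?id=N URLs above are counted once. Standard-library Python only; example.org stands in for the real site.

        # Sketch: link-based site mapper with query-string de-duplication.
        from collections import deque
        from html.parser import HTMLParser
        from urllib.parse import urljoin, urlparse, urlunparse
        from urllib.request import urlopen

        class LinkParser(HTMLParser):
            """Collect href values from anchor tags."""
            def __init__(self):
                super().__init__()
                self.links = []
            def handle_starttag(self, tag, attrs):
                if tag == "a":
                    for name, value in attrs:
                        if name == "href" and value:
                            self.links.append(value)

        def site_map(start_url, limit=200):
            root = urlparse(start_url).netloc
            seen, queue = set(), deque([start_url])
            while queue and len(seen) < limit:
                parts = urlparse(queue.popleft())
                # Drop query and fragment: ?id=1 and ?id=2 become one page.
                page = urlunparse((parts.scheme, parts.netloc, parts.path, "", "", ""))
                if page in seen or parts.netloc != root:
                    continue
                seen.add(page)
                try:
                    html = urlopen(page, timeout=10).read().decode("utf-8", "replace")
                except Exception:
                    continue  # skip unreadable pages, keep crawling
                parser = LinkParser()
                parser.feed(html)
                queue.extend(urljoin(page, link) for link in parser.links)
            return sorted(seen)

        for url in site_map("http://example.org/"):
            print(url)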

    Read the article

  • Download a website that requires a log-in with HTTrack Website Copier

    - by H.Moss
    Hi guys! I have been researching how to download the content of a site that requires a username and password. This is actually harder than I thought it would be. I tried to use HTTrack Website Copier and followed the instructions below, but it's not working.

    Q: I cannot access several pages (access forbidden, or redirected to another location), but I can with my browser. What's going on?

    A: You may need cookies! Cookies are specific data (for example, your username or password) that are sent to your browser once you have logged in to certain sites, so that you only have to log in once. For example, after having entered your username on a website, you can view pages and articles, and the next time you go to this site, you will not have to re-enter your username/password. To "merge" your personal cookies into an HTTrack project, just copy the cookies.txt file from your Netscape folder (or the cookies located in the Temporary Internet Files folder for IE) into your project folder (or even the HTTrack folder).
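
    The same cookie idea, shown as a hedged Python sketch for comparison (this does not drive HTTrack; it only illustrates why a saved login cookie makes protected pages fetchable). The login URL and form-field names are hypothetical placeholders.

        # Sketch: log in once, then reuse the session cookie for later fetches.
        import requests

        session = requests.Session()
        session.post(
            "https://example.com/login",                    # placeholder URL
            data={"username": "me", "password": "secret"},  # placeholder fields
        )
        # Cookies the server set at login now ride along automatically:
        page = session.get("https://example.com/members/only.html")
        print(page.status_code)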

    Read the article

  • Website: Requested filename being rewritten

    - by horatio
    I have been unable to find an answer via search. I have a website (I do not administer the servers) where the server will serve a different file than the one requested. I first noticed this when using a filename of the following form: _foo.php (single underscore) If I request foo.php (does not exist), the server returns _foo.php. By "returns" I mean that the server decides I meant _foo.php, processes the php file, and serves the output. If I request afoo.php, zfoo.php, or even __foo.php (two underscores) (these files do not exist) the server returns _foo.php. If I request aafoo.php, the server returns 404. To sum up: the server seems to be doing a partial filename match. My question is: what is happening and is this accepted behavior for a web server (or standard behavior of a common mod/package/etc)?

    Read the article

  • Recovering an old website

    - by noah
    I have a client with an old website that somebody set up for him long ago. The guy who set it up is unreachable, so how do we go about trying to take it over? A WHOIS lookup got us some contact information, but I don't have great hopes for that (it hasn't been updated in quite some time). The nameservers are ns1.theplanet.com and ns2.theplanet.com, and we will try calling them, but I don't expect we'll be able to get much from them. What are our options? Is there a way I can discover the registrar so we can try contacting them as well? EDIT: It would be sufficient if we could get control of the domain name or put in some sort of redirect to the new site. Either the hosting was prepaid for quite some time, or someone else is still paying for it, so we don't care about that.
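
    On the "discover the registrar" part, a hedged sketch of a raw WHOIS lookup (RFC 3912: a plain-text query over TCP port 43). The domain is a placeholder, and whois.verisign-grs.com only answers for .com/.net; other TLDs use different WHOIS servers.

        # Sketch: ask the .com WHOIS server which registrar holds a domain.
        import socket

        def whois(domain, server="whois.verisign-grs.com"):
            with socket.create_connection((server, 43), timeout=10) as sock:
                sock.sendall((domain + "\r\n").encode())
                chunks = []
                while True:
                    chunk = sock.recv(4096)
                    if not chunk:
                        break
                    chunks.append(chunk)
            return b"".join(chunks).decode(errors="replace")

        for line in whois("example.com").splitlines():
            if "Registrar" in line:  # registrar lines name who controls the record
                print(line.strip())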

    Read the article

  • Have Free website Hosting with Google App Engine

    - by mickthompson
    I'm reading about Google App Engine. I'm creating a bunch of simple dynamic websites in Java. I'm considering using Google App Engine and setting up my clients' websites on it. This way I only have to register a domain like www.myclientdomain.com and then point it to the GoogleAppEngine application. In this way I plan to avoid hosting costs; in fact, I'm currently paying even for hosting a few static HTML pages. Do you think it is possible to use Google App Engine for this purpose?

    Read the article

  • Distributed website server redundancy

    - by Keith Lion
    Assume a website infrastructure is very complicated and fully distributed (probably like most large web companies). Am I right in thinking that although there are all these extra web servers to handle multiple client requests, there is still a single "machine" through which users must enter? I am guessing this machine will be the one physically associated with the IP address. I ask because I need to know whether, in places where distributed systems exist, there is still a single point of failure, usually the control node or, in this example, the machine connected to the public internet. Surely there cannot be two machines connected to the internet, as they would have to have different IP addresses? This "machine" may not be a server per se; maybe it is a piece of Cisco equipment. I just need to know whether, in the real world, these distributed systems still have a particular section where they depend on the integrity of one electronic device.
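
    One small, checkable counterpoint to the "one IP address, one machine" intuition (a hedged sketch): a single hostname can resolve to several addresses via round-robin DNS, and each address can itself front many machines behind load balancers. The snippet simply lists the addresses a resolver returns for a familiar name.

        # Sketch: one hostname often maps to multiple IP addresses.
        import socket

        infos = socket.getaddrinfo("google.com", 80, proto=socket.IPPROTO_TCP)
        for address in sorted({info[4][0] for info in infos}):
            print(address)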

    Read the article

  • block access to certain website types

    - by frustrated teacher
    Need to block access to certain website types without listing each URL to block. Students at a secondary school are going to porn sites, and I need to be able to block all such access without having to list each possible site URL. Having the Content -> Ratings tab set to None for all categories in the ratings files listed on my computers does not prevent access. Unchecking "users may access sites with no rating", even with the security settings set to High, still allows the porn sites to come up. If that is checked, then ONLY listed sites can open, and students would not be able to do any research via Google, for example. I would rather not have to keep checking each computer and blocking sites as the students find them.

    Read the article

< Previous Page | 3 4 5 6 7 8 9 10 11 12 13 14  | Next Page >