Search Results

Search found 8284 results on 332 pages for 'trusted sites'.

Page 54/332 | < Previous Page | 50 51 52 53 54 55 56 57 58 59 60 61  | Next Page >

  • extra configuration needed after installing SSL certificate?

    - by ptriek
    We recently developed two fairly simple PHP applications for AXA (a European bank). The URLs are axa.tfo.be/incentives/cipres and axa.tfo.be/incentives/zrkk (access to both sites is restricted to visitors holding cookies with encrypted passwords). A previous security audit by an external company found several security issues, all of which were solved by a colleague PHP developer. However, one last requirement has been added: all data should be transferred over https. My PHP colleague is on holiday, though, and unavailable at the moment, so I contacted my host and asked them to install an SSL certificate. I have no knowledge of or experience with SSL myself, so I'm a bit at a loss with the following problems. A Comodo SSL certificate plus a unique IP address was installed today by my web host (www.combell.be) for the subdomain axa.tfo.be. However, it doesn't seem to be working. I posted a question about this earlier today and was told not to worry, see link: http://serverfault.com/questions/339320/what-happens-if-you-install-an-ssl-certificate

    Current problems: the web applications aren't accessible over https, though http works (if a valid cookie is available); there's a static HTML page at http://axa.tfo.be/incentives/cipres/static.html, and even that page is only accessible over http. My web host is telling me that 'my application probably doesn't support SSL', and has asked me to set an SSL variable to true in my PHP code.

    So my questions: I have basic knowledge of PHP, but I don't know where to start with this 'PHP SSL variable'. The sites have been online for some time and were developed for regular http access. (Google didn't bring me any help, either.) Can anyone point me in the right direction, or give me some clues about whether/what I should ask my web host for further assistance? I'm on a tight schedule: the sites will be audited again on Monday, and it's a customer I wouldn't want to lose... Thanks for looking into this, and sorry if my questions sound a bit nooby - I'm a web designer, not a server specialist...
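
    There is no single built-in "SSL variable" you set to true in PHP; most likely the host means either an application config flag or a check on PHP's $_SERVER['HTTPS']. A minimal sketch of the usual pattern, assuming the certificate already answers on port 443:

        <?php
        // Hedged sketch: force https by redirecting any plain-http request.
        // PHP sets $_SERVER['HTTPS'] when the request arrived over SSL.
        if (empty($_SERVER['HTTPS']) || $_SERVER['HTTPS'] === 'off') {
            $target = 'https://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'];
            header('Location: ' . $target, true, 301);
            exit;
        }
        // If the login cookie must only travel over https, also pass
        // $secure = true as the sixth argument to setcookie().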

    Read the article

  • WebCenter Customer Advisory Board meetings kick off Oracle Open World 2012!

    - by Lance711
    Welcome to OpenWorld! OpenWorld 2012 got underway today with a series of meetings with the members of the WebCenter Customer Advisory Board. Led by the WebCenter Product Management team, these meetings are a great way for the product team and customers to interact directly, discuss real-life business challenges and product details, and preview upcoming features and functionality. This year, board members took part in live demos of product enhancements that will be featured throughout the coming week. Highlights included a variety of new mobile and social solutions, a great new user interface for WebCenter Content, plus new Portal and Sites functionality that makes the experience for the everyday user a lot more pleasant.

    The day kicked off with Roel Stalman, VP of Product Management, giving a detailed overview of what’s new in WebCenter. Given all the improvements to discuss, this session went over 2 hours! Roel showcased the brand new UI for Content, Portal and Sites. He also gave live demos of the new mobile apps for WebCenter Content, Portal and the Oracle Social Network. The attendees then broke into sub-groups in order to deep-dive with Product Management for the Portal, Sites, and Content product areas on specific functionality and application integrations.

    If you are here in San Francisco this week for OpenWorld, I definitely recommend stopping by the WebCenter area in the Moscone West Exhibition Hall to see some of this new functionality for yourself. And be sure to check out the WebCenter sessions throughout the week, as those give us a chance to discuss direction and strategy, answer your questions and get your feedback and ideas. For those of you who could not make it to OpenWorld this year, we miss you! You can stay in touch with what is happening via this blog and by following #oow and #webcenter on Twitter. Additionally, we will be rolling out details on upcoming products and release info over the coming months via this blog and web seminars. Stay tuned!

    Read the article

  • Make Google Plus One only work for the domain and not the path [closed]

    - by Saeed Neamati
    Possible Duplicate: Make Google +1 button +1 a specific URL rather than the URL it's on? I'm creating an image sharing website, and since it's going to have many thousands of links, it's almost impossible to accumulate Plus Ones page by page. A Plus One is an indication of a site's popularity and trust: you follow a link in the SERP because you see that somebody you know has already plus-oned that link, so you trust it and click it. The more Plus Ones a page gets, the more trustworthy it becomes. Sites with a few simple static pages can gather many Plus Ones, but sites like mine (dynamic sites with thousands of links) can't aggregate Plus Ones on one page. Is there any way to tell Google that I only want the Plus Ones to be counted for the domain, and not for the path? In other words, how can I turn a Plus One given to http://example.com/tag1-tag2/2525 into a Plus One given to http://example.com? Is it possible at all?
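
    One approach, sketched from the button's documented markup rather than anything confirmed in the question: the +1 button accepts an explicit target URL, so every page can render a button that credits the domain root instead of its own address.

        <!-- Hedged sketch: data-href points the +1 at the domain, not the page. -->
        <div class="g-plusone" data-href="http://example.com/"></div>
        <script type="text/javascript" src="https://apis.google.com/js/plusone.js" async></script>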

    Read the article

  • How to Use and Customize Google Chrome Web Apps

    - by The Geek
    Google announced their new Chrome Web Store today, with loads of web sites and games that can be installed as applications in your browser, synced across every PC, and customized to launch the way you want them to. Here’s how it all works. Note: this guide really isn’t aimed at expert geeks, though you’re more than welcome to leave your thoughts in the comments. What Are Chrome Web Apps Again? The new Chrome Web Apps are really nothing more than regular web sites, optimized for Chrome and then wrapped up with a pretty icon and installed in your browser. Some of these sites, especially web-based games, can also be purchased through the Chrome Web Store for a small fee, though the majority of services are free.

    Read the article

  • Can I remove all-caps and shorten the disclaimer on my License?

    - by stefano palazzo
    I am using the MIT License for a particular piece of code. Now, this license has a big disclaimer in all-caps: THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF... I've seen a normally capitalised disclaimer on the zlib license (notice that it is above the license text), and even software with no disclaimer at all (which implies, I take it, that there is indeed a guarantee?), but I'd like some sourced advice from a trusted party. I just haven't found any. GNU's license notice for other files comes with this disclaimer: "This file is offered as-is, without any warranty." Short and simple. My question, therefore: are there any trusted sources indicating that a short rather than long, and a normally spelled rather than capitalised, disclaimer (or even one or the other) is safely usable in all of the jurisdictions I should be concerned with? If the answer turns out to be yes: why not simply use the short license notice that the FSF proposes for readme files and short help documents instead of the MIT License? Is there any evidence suggesting this short 'license' will not hold up? For the purposes of this question, the software is released in the European Union, should it make any difference.

    Read the article

  • SSL Certificate Works in Monit - But Not in Keystore

    - by Bart Silverstrim
    I have a situation where there's a keystore file with the various root/intermediate certificates stored in it in a way that seems to work for most browsers. The problem is that when mobile browsers hit it, there's a break in the chain and they complain. I used the SSL checker at http://www.sslshopper.com/ssl-checker.html and it states that "The certificate is not trusted in all web browsers. You may need to install an Intermediate/chain certificate to link it to a trusted root certificate." So, I'm assuming, the desktop browsers must already have the intermediate certs and can make the chain connections, while the mobile browsers can't. The thing is that I had used Portecle to export certificates from the keystore and cobble them together into a .PEM certificate to run the Monit utility. When I check that application with the SSL checker, it works fine! The person who originally created the keystore said he couldn't follow the SSL provider's directions for creating it, because he created the CSR using openssl, so the cert and private key had to be converted to DER format and imported with importkey; the directions he found online had importkey write to one fixed keystore file, and it would erase anything already in that file if it existed. So is there a way to take the certificate I created for Monit and create a working keystore for the Tomcat website? What would be causing the chain to be broken in the current keystore, but work for Monit? I have the SSL cert provider's intermediate and cross certificates, and the website's certificate, but what else would I need to create a working chain of certs for a keystore?
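
    Since the working Monit .PEM already contains the full chain, one route (a sketch with assumed filenames; yours will differ) is to bundle the same PEM pieces into a PKCS12 file and convert that into a fresh keystore for Tomcat, so the private key and the chain travel together:

        # 1. Combine the private key, site certificate and chain into PKCS12:
        openssl pkcs12 -export -in site.crt -inkey site.key \
            -certfile intermediates.pem -name tomcat -out site.p12

        # 2. Convert the PKCS12 bundle into a Java keystore:
        keytool -importkeystore -srckeystore site.p12 -srcstoretype PKCS12 \
            -srcalias tomcat -destkeystore keystore.jks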

    Read the article

  • Presenting at VS Live! Orlando in December

    - by Steve Michelotti
    I’ll be presenting at VS Live!, December 10-14, in Orlando, FL. I’ll be presenting Azure Web Sites. This is the session abstract: Azure Web Sites brings a whole new level of power and simplicity to cloud computing. This demo-heavy session will show numerous features that allow you to deploy your site in a matter of seconds. Whether you are building a completely custom app or deploying from one of the numerous templates provided (such as WordPress), you’ll be up and running in no time. Want to use Node.js or PHP and deploy from Git? No problem! Azure Web Sites gives you the power of elastic scaling while still providing streamlined development and an effortless deployment experience. This presentation will also cover features including monitoring, custom domains, working with SQL databases, and more! SPECIAL OFFER: As a speaker, I can extend $500 savings on the 5-day package. Register here: http://bit.ly/VOSPK19Reg and use code VOSPK19. The great part about Visual Studio Live!: four events in one! This year, the event will be co-located with SQL Server Live!, SharePoint Live!, and Cloud & Virtualization Live!. You can customize your conference agenda and attend ANY sessions from all four events. Register now: http://bit.ly/VOSPK19Reg

    Read the article

  • Share on: FB, Tweet, Digg, Linkedin, Delicious, My mother, ... is it just fashion, or some real value?

    - by Marco Demaio
    Nowadays your site is not in fashion if you don't show at least a couple of share buttons like these: Is this just fashion, or do people actually get something good out of it? When I say "something good" I mostly mean something that you could measure, not just a feeling that it was good. Maybe I can explain better with an example: did you notice (in some way) that many people clicked on those links to share your page/s on those web 2.0 social sites? And in that case, on which social networks did you see them mostly share your pages? BTW, I'm not talking about Google PR; I know all the web 2.0 social sites use nofollow everywhere, and even hidden links, so by themselves they are useless for PR. UPDATE: according to this video, Google's Alter Ego says that they now use data from social sites in ranking in some way. If this is true, it's obvious that the Share on buttons for FB, Tweet, etc. are definitely of some value. But again, my question is more about what you have noticed in your real experience to be a direct benefit of adding those types of "Share on" links to your website. I.e. did you see more traffic coming in from FB, or some users who bought your products because of FB or Twitter? Or any other benefits? Thanks
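
    If you want numbers rather than feelings, one measurable angle (a sketch; it assumes the classic ga.js tracker, and the URLs are illustrative) is to record clicks on your own share buttons as social interactions in Google Analytics:

        // Hedged sketch: log a share-button click as a social interaction
        // using the ga.js asynchronous queue.
        _gaq.push(['_trackSocial', 'facebook', 'share', 'http://example.com/page.html']);
        _gaq.push(['_trackSocial', 'twitter', 'tweet', 'http://example.com/page.html']);

    With that in place, the "did anyone actually use these?" question becomes a report you can read instead of a guess.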

    Read the article

  • Combining Code Review with Trust Metrics

    - by DragonFax
    I don't get the chance to partake of it at work, but I love the idea of code review - especially online open source code review like Gerrit Code Review. I love what trust metrics have done for forums and collective-intelligence sites on the internet like Stack Exchange, Reddit, and Wikipedia. Would it be possible to combine the two and come up with an open source project management system? Something that ends up being mostly community driven - perhaps a kind of Wikipedia of code for a project, where submitters become popular/trusted by having lots of patches reviewed favorably by others and accepted into the trunk, and popular/trusted submitters get their patches accepted faster/easier. I'm looking for some opinions on the idea, or perhaps pointers to where it's been done before, if that's the case. This might leave the lead maintainer little more to do than: wrangle the direction of the project by fast-tracking or vetoing specific patches; settle disputes when the CI tests break, or fix it himself. Is design by community worse than design by committee?

    Read the article

  • Authentication system brainstorm

    - by gansbrest
    Hi. We've got multiple small websites (microsites) and one main high-traffic one with a big user base. The requirement right now is to build an authentication system that allows users to log in with the same identity across the network. All the websites run on different domains, are powered by the Drupal 6 CMS, and have separate databases (so sharing tables with a prefix is not an option - plus it creates a huge mess in the db). Here is the set of core requirements I came up with: (1) users should be able to log in with the same credentials to all sites within the network; (2) user data is shared between the main site (the storage) and all microsites within the network; (3) data is synchronized across the network when a user changes it (updates an email or password, for example); (4) the login/registration process is seamless and consistent; (5) users can register on any of the sites across the network and use that identity to log in later on. In the future there might be a need to add OpenID authentication options. Basically we are looking at something similar to what Stack Exchange does, but I'm not sure whether they have a central user base or not. I was thinking about a custom solution with two parts (modules): one stored on the main site, storing user data and responding to requests from clients, and a second module placed on each microsite, sending requests to the master - some kind of client-server setup. One of the complications I see right away is #3, data synchronization across the network. I just don't want to reinvent the wheel, and maybe some work has already been done in this direction. Looking forward to your ideas on how to approach this project. EDIT: We use a MySQL database.
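
    As a sketch of the client-server idea (every endpoint and function name here is made up; the call uses Drupal 6's drupal_http_request() signature), each microsite module could forward credentials to a master endpoint and trust its verdict:

        <?php
        // Hedged sketch: verify credentials against a hypothetical endpoint
        // on the main site. Returns the decoded user record or FALSE.
        function mymodule_network_login($username, $password) {
          $response = drupal_http_request(
            'https://main.example.org/auth/verify',  // hypothetical master endpoint
            array('Content-Type' => 'application/x-www-form-urlencoded'),
            'POST',
            http_build_query(array('name' => $username, 'pass' => $password))
          );
          return ($response->code == 200) ? json_decode($response->data) : FALSE;
        }

    Requirement #3 then shrinks a little: if microsites never store credentials and always ask the master, only profile data needs syncing.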

    Read the article

  • Oracle Endeca Information Discovery 3.1 is Now Available

    - by p.anda
    Oracle Endeca Information Discovery (OEID) 3.1 is a major release that incorporates significant new self-service discovery capabilities for business users. These include agile data mashup, extended support for unstructured analytics, and an even tighter integration with Oracle BI. This release is available for download from the Oracle Delivery Cloud and the Oracle Technology Network. Some of the what's-new highlights:

    - Self-service data mashup... enables access to a wider variety of personal and trusted enterprise data sources. Blend multiple data sets in a single app.
    - Agile discovery dashboards... allow users to easily create, configure, and securely share discovery dashboards with intelligent defaults, intuitive wizards and drag-and-drop configuration.
    - Deeper unstructured analysis... enables users to enrich text using term extraction and whitelist tagging while the data is live.
    - Enhanced integration with OBI... provides easier wizards for data selection and enables OBI Server as a self-service data source.
    - Enterprise-class data discovery... offers faster performance, a trusted data connection library, improved auditing and increased data connectivity for Hadoop, web content and Oracle Data Integrator.

    Find out more... visit the OEID Overview page to download the What's New and related Data Sheet PDF documents. Have questions or want to share details about Oracle Endeca Information Discovery? The MOS Communities are a great first stop; you can stop by the MOS OEID Community.

    Read the article

  • Kubuntu 12.04 - DNS Issues

    - by AndrewJesaitis
    Starting yesterday (6/11/12), I've been having many network problems. When requesting a page in Chrome, the page hangs on "Sending request" and then eventually times out. I'm within a VPN that has its own DNS server. I've tried to set my DNS manually through the Network-Manager applet and by editing /etc/network/interfaces. Having no luck, I unlinked the resolv.conf file and dumped the contents of my old resolv.conf into it. Again having no luck, I deactivated the dnsmasq server in /etc/NetworkManager/NetworkManager.conf by commenting out the dns=dnsmasq line:

        $ cat NetworkManager.conf
        [main]
        plugins=ifupdown,keyfile
        #dns=dnsmasq
        no-auto-default=D0:67:E5:EA:B6:6B,

        [ifupdown]
        managed=false

        $ nm-tool
        NetworkManager Tool
        State: connected (global)
        - Device: eth0 [Wired connection 1] -------------------------------------------
          Type:              Wired
          Driver:            tg3
          State:             connected
          Default:           yes
          HW Address:        D0:67:E5:EA:B6:6B
          Capabilities:
            Carrier Detect:  yes
            Speed:           1000 Mb/s
          Wired Properties
            Carrier:         on
          IPv4 Settings:
            Address:         192.168.254.122
            Prefix:          24 (255.255.255.0)
            Gateway:         192.168.254.2
            DNS:             192.168.254.1

    What is strange is that the network will work fine for a few minutes, then start to time out. A few minutes later it will work again. I'm unable to hit internal or external sites while it is timing out. When I $ dig local sites, I receive no answer; I do receive an answer from google.com. At this point I would usually blame the DNS server, especially since things work when I change to Google's DNS server. But I need to use our internal DNS to hit our internal sites. Nobody else is having issues, and they are all using DHCP. That group includes one user who is on 11.04. At this point I'm at a loss for what to do, so any help would be appreciated.
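
    For what it's worth, one way to pin the VPN's resolver without NetworkManager fighting you (a sketch; the search domain is made up) is to declare the interface statically in /etc/network/interfaces, since on 12.04 resolvconf picks up the dns- options from there:

        # /etc/network/interfaces - hedged sketch using the values nm-tool reports
        auto eth0
        iface eth0 inet static
            address 192.168.254.122
            netmask 255.255.255.0
            gateway 192.168.254.2
            dns-nameservers 192.168.254.1
            dns-search corp.example.com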

    Read the article

  • Correct configuration of multiple Analytics trackers per page, spanning domains and subdomains

    - by Eliot Shepard
    My company publishes sites on a somewhat convoluted domain structure, and we're having trouble getting accurate numbers in Analytics when we have multiple trackers on the page. We publish under two brands (A, B). Each brand has a "national" site at A.com, B.com, as well as per-city "local" sites at e.g. ny.A.com, la.A.com, sf.A.com, etc. Right now we're trying to track in these dimensions: the full network (A.com, ny.A.com, B.com, la.B.com, etc.), all sites in a brand (A.com, ny.A.com, la.A.com, etc.), and the individual site (ny.A.com). Here are the commands we're using on an individual site:

        _gaq.push(
          ['t0._setAccount', 'UA-XXXXXX-1'], // full network
          ['t0._setDomainName', 'none'],
          ['t0._setAllowLinker', true],
          ['t0._trackPageview'],
          ['t1._trackPageLoadTime'],
          ['t1._setAccount', 'UA-XXXXXX-2'], // brand
          ['t1._setDomainName', 'none'],
          ['t1._setAllowLinker', true],
          ['t1._trackPageview'],
          ['t1._trackPageLoadTime'],
          ['t2._setAccount', 'UA-XXXXXX-3'], // individual
          ['t2._setDomainName', 'none'],
          ['t2._setAllowLinker', true],
          ['t2._trackPageview'],
          ['t2._trackPageLoadTime']
        );

    We send the same commands to each account because we've had strange results in the past when the trackers were configured differently. However, right now we're seeing inflated unique-visitor numbers on all three trackers. What is the correct way to configure this setup? Thanks for your time.
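
    A sketch of one commonly suggested arrangement (not a verified fix for this setup): give each tracker the widest cookie domain it actually spans, instead of 'none' everywhere, so each scope counts a returning visitor once.

        _gaq.push(
          ['t0._setAccount', 'UA-XXXXXX-1'],  // full network: separate domains
          ['t0._setDomainName', 'none'],      // can't share a cookie, so keep
          ['t0._setAllowLinker', true],       // 'none' plus the linker here
          ['t1._setAccount', 'UA-XXXXXX-2'],  // brand: spans A.com and *.A.com
          ['t1._setDomainName', '.A.com'],    // one cookie for all subdomains
          ['t2._setAccount', 'UA-XXXXXX-3'],  // individual site only
          ['t2._setDomainName', 'ny.A.com'],
          ['t0._trackPageview'],
          ['t1._trackPageview'],
          ['t2._trackPageview']
        );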

    Read the article

  • Should a model binder populate all of the model?

    - by Richard
    Should a model binder populate all of the model, or only the bits that are being posted? For example, I am adding a product in my system, and on the form I want the user to select which sites the new product will appear on. Therefore, in my model I want to populate a collection called "AllAvailableSites" to render the checkboxes for the user to choose from. I also need to populate the model with any chosen sites on a post, in case the form does not validate and I need to re-present the form showing the initial selections. It would seem that I should let the model binder set the chosen sites on the model, and (once in the controller method) set the "AllAvailableSites" on the model myself. Does that sound right? It seems more efficient to set everything in the model binder, but someone is suggesting it is not quite right. I am grateful for any advice; I have to say that all the MVC model-binding help online seems to cite really simple examples, nothing complicated. Do I really need a GET and a POST version of a method? Can't they just take the same view model? Then I check in my model binder whether it is a GET/POST, and populate the model accordingly.
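
    For what it's worth, a sketch of the conventional split (all names here are illustrative, not from the question): the binder fills only what was posted, both actions share the view model, and the controller re-hydrates the list data the view needs.

        [HttpGet]
        public ActionResult Create()
        {
            var model = new ProductViewModel();
            model.AllAvailableSites = _siteRepository.GetAll(); // for the checkboxes
            return View(model);
        }

        [HttpPost]
        public ActionResult Create(ProductViewModel model)  // binder sets the chosen sites
        {
            if (!ModelState.IsValid)
            {
                model.AllAvailableSites = _siteRepository.GetAll(); // re-hydrate for redisplay
                return View(model);
            }
            _productService.Add(model);
            return RedirectToAction("Index");
        }

    This keeps the binder doing only what it is good at (mapping posted fields) and leaves view-support data out of it.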

    Read the article

  • Need help making a decision on a career switchover? [closed]

    - by Fero
    I am a software engineer with 4 years of experience in web development using PHP, Drupal, MySQL, Ajax and client-side technologies like JavaScript, jQuery, HTML and more. I have decided on two platforms to switch my career to: SAP ABAP (because ABAP is related to coding) or Salesforce. My one and only reason is that I am not getting a good package for the technologies I am working with. (Even top-level companies are not ready to pay well for these technologies, and I am not expecting more.) To be honest, I am good at technical and HR interviews too. So I made an analysis of highly paid platforms and arrived at these two: SAP and Salesforce (the probability of on-site opportunities is also very high for both). Here are my questions: I am totally new to the above-mentioned technologies - which will best suit me? I have basic ideas of both platforms but am confused about which to choose. I have good coding experience in PHP and Drupal as well as good experience with MySQL, and very good experience creating sites related to e-commerce, LMS, Q&A sites, travel sites, blogs, social networking sites and more. Which can I learn easily, or for which can I get good documentation online? Kindly understand that I am not trying to start a debate here; I hope the professionals here can show me the correct path... I am waiting to travel on it...

    Read the article

  • Wordpress Multisite and Google Analytics in subfolders with mapped domains

    - by David
    I have a WordPress multisite using subfolders. The sites' subfolders are mapped to domains, which are set as primary. I'm using the 'Google Analytics Multisite Async' code to track things. From what I can see it's tracking the sites fine (getting page hits for each site in Google Analytics), except for the original site in the multisite: its content overview lists the other sites' domains and the traffic they're getting, along with the original domain's traffic. I don't want to track any traffic for my original site other than what goes to it directly; i.e. I don't want it tracking the other sites in the multisite. E.g. domain1.com is my original, and I have lots of other sites in the multisite, let's say domain2.com and domain3.com. In the content overview in Analytics it lists, say, domain2.com pages as content. Can I tell it to filter these out somehow, either in Analytics or within WordPress? Hopefully I've explained that clearly!
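
    On the WordPress side, one sketch of a fix (is_main_site() is a core multisite function; the function name and hook placement are illustrative) is to emit the original site's tracker only when the code is running on the main blog:

        <?php
        // Hedged sketch: only print the domain1.com tracking snippet on the
        // original (main) site of the network.
        function mytheme_maybe_print_analytics() {
            if ( is_main_site() ) {
                // echo the UA-XXXXXX-1 snippet for domain1.com here
            }
        }
        add_action( 'wp_head', 'mytheme_maybe_print_analytics' );

    The other route is on the Analytics side: a profile filter that includes only hits whose hostname is domain1.com.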

    Read the article

  • guideline for promoting a web forum or portal

    - by Hafiz
    I am a web application developer and have developed a lot of sites, portals and apps for clients. Now I want to have my own such sites with which I can do business. But I don't know the steps to take. What I know is how to make a site or portal - but what after that? There are lots of people getting huge traffic on sites built with simple open source software; many of them are forums. I also want to start a forum and a web portal. I can develop them myself, or use open source. But what after that? Content entry and SEO? Is that all it takes to promote a portal or site? Does SEO still work nowadays, or is it all about marketing and advertising? I have no idea about this, so please tell me what you suggest. Thanks in advance for everyone's opinions.

    Read the article

  • Price Drop for Processor-based Licenses on Exalytics

    - by Mike.Hallett(at)Oracle-BI&EPM
    - 33% reduction in the list `per processor` license pricing for the Oracle BI Foundation Suite.
    - New capacity-based licensing which allows customers to think big & start small, significantly lowering the entry price point for an Exalytics.

    Oracle BI Software List Price changes: In response to powerful new platforms like the in-memory Oracle Exalytics with 40 CPU cores (counted under Oracle pricing policy as 20 "processors"), the list price of the "Oracle BI Foundation Suite" (BIFS) is reduced by 33%, from $450K per processor to $300K per processor.

    Capacity-based licensing on Exalytics (Trusted Partitions): "Capacity-based pricing" for the BIFS, Endeca, Essbase and TimesTen for Exalytics software is now available for Exalytics systems. This is delivered using "Oracle VM" (OVM). We still ship a full Exalytics machine to all customers, but they may choose to use and license only a subset of the processors installed in the machine. Customers can license Exalytics software in units of 5 "processors": 5, 10, 15 or the full capacity of 20. As the customer's implementation and workload grow, it is a simple matter to license additional processors and, using OVM, make them available to the BI or EPM application.

    Endeca Information Discovery now available on Exalytics: Oracle has also announced the certification of "Oracle Endeca Information Discovery" (EID) on the Exalytics machine. EID can be licensed alone or in combination with the BIFS & TimesTen for an Exalytics stack, and it also participates in the capacity-based pricing outlined above. The Exalytics hardware is the perfect platform for EID, and provides superb power and performance for this in-memory hybrid of text search and analytics.

    For more information: Oracle Price lists; Oracle Partitioning Policy; discussion by Mark Rittman (Rittman Mead Consulting Ltd.) on Oracle Trusted Partitions for Oracle Engineered Systems, Oracle Exalytics and Updated BI Foundation Pricing.

    Read the article

  • Are you ready to take a walk in the clouds?

    - by Steve Loethen
    Cloud computing is here, whether we want it or not. When I say "a walk in the clouds" I am not talking about a pleasant romantic comedy, but a real alternative to hosting applications on-premise. For years we have had the power to host our web sites on remote systems - mostly simple web sites, and sure, challenges existed. I could, with a few clicks, create an account at any of a myriad of web hosting companies, put my site in the hands of a remote host, and boom, I was a site on the internet. But choices, power, and management were limited. Now we have a set of services that give us the power and control we love, with the scalability of the data center. My personal web site is hosted on a laptop running Hyper-V in my basement. I have to manage the machine, patch it, make sure it is powered up. This is fine for the "hello, this is my dog Skippy" site that I maintain. If the football pool I run has an issue, one of the 10 users I have calls or emails me and I go check it out. All is well. But this falls well below the needs of even the simplest of enterprises. A business needs a stronger data center, a better pipe to the world. Do I really want to base my business on dynamic DNS and a DSL line from the local phone company? Cloud computing gives us most of what I value (control, a db of my own, updating my site from Visual Studio). Come learn how this technology can transform your business. If you are a Microsoft shop, or are interested in Microsoft in the cloud, a free 2-day Azure training class is being conducted in Kansas City on April 8 and 9: http://www.azurebootcamp.com/city/kansascity Hope to see you there. If you come, make sure you look me up.

    Read the article

  • Unified data source for K2-installed Joomla websites

    - by Özkan ÖZLÜ
    I am responsible for a few of my organization's web sites. I use Joomla! 2.5.9 for them, and they all run on the same server. I use the K2 component for content management. I have a general website which shows all the staff information on its 'Staff' page. Some of those people and their content are also shown on another department's website. Each web site has its own database. For example: on the general website (let's say general.org), when I click on the 'Staff' menu item, the page shows all of the people who work at my organization, across different departments. On another web site (e.g. education.general.org), when I click on the 'Staff' menu item, it shows the people who work in the education department. But each web site has its own user accounts, which means a modification on one of them does not affect the others. If one of the education staff changes his profile picture on the education web site, he also has to do it on the general web site. And sometimes one person works in two departments, so he has to edit his data three times. Is it possible to merge the records for all the websites? In other words, I want everyone to insert/update their data on the general web site, and have the other web sites update automatically.

    Read the article

  • What are some internet trends that you've noticed over the past ~10 years? [closed]

    - by Michael
    I'll give an example of one that I've noticed: the number of web sites that ask for your email address (GOOG ID, YAHOO! ID, etc.) has skyrocketed. I can come up with no legitimate reason for this other than (1) password reset [other ways to do this], or (2) to remind you that you have an account there, based upon the time of your last visit. Why does a web site need to know your email address (Google ID, etc.) if all you want to do is...

    - download a file (no legit reason whatsoever)
    - play a game (no legit reason whatsoever)
    - take an IQ test or search a database (no legit reason whatsoever)
    - watch a video or view a picture (no legit reason whatsoever)
    - read a forum (no legit reason whatsoever)
    - post on a forum (mildly legit reason: password reset)
    - newsletter (the only difference between a newsletter and a blog is that you're more likely to forget about the web site than you are to forget about your email address -- the majority of web sites do not send out newsletters, however, so this can't be the justification)
    - post twitter messages or other instant messaging (mildly legit reason: password reset)
    - buy something (mildly legit reasons: password reset + giving you a copy of a receipt that they can't delete, as receipts stored on their server can be deleted)

    On the other hand, I can think of plenty of very shady reasons for asking for this information:

    - so the NSA, CIA, FBI, etc. can very easily track what you do by reading your email or asking GOOG, etc. what sites you used your GOOG ID at
    - to use the password that you provide for your account in order to get into your email account (most people use the same password for all of their accounts), find all of your other accounts in your inbox, and then get into all of those accounts
    - to sell your email address to spammers

    These reasons, I believe, are why you are constantly asked to provide your email address. I can come up with no other explanations whatsoever. Question 1: Can anyone think of any legitimate or illegitimate reasons for asking for someone's email address? Question 2: What are some other interesting internet trends of the past ~10 years?

    Read the article

  • How to make the internal subwoofer work on an Asus G73JW?

    - by CodyLoco
    I have an Asus G73JW laptop which has an internal subwoofer built in. Currently, the system detects the internal speakers as a 2.0 system (4.0 is the only other option I can change to). I found a bug report here: https://bugs.launchpad.net/ubuntu/+source/alsa-driver/+bug/673051 which discusses the bug, and according to it a fix was sent upstream back at the end of 2010. I would have thought this would have made it into 12.04, but I guess not? I tried following the link given at the very bottom to install the latest ALSA drivers, here: https://wiki.ubuntu.com/Audio/InstallingLinuxAlsaDriverModules However, I keep running into an error when trying to install:

        $ sudo apt-get install linux-alsa-driver-modules-$(uname -r)
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        E: Unable to locate package linux-alsa-driver-modules-3.2.0-24-generic
        E: Couldn't find any package by regex 'linux-alsa-driver-modules-3.2.0-24-generic'

    I believe I have added the repository correctly:

        $ sudo add-apt-repository ppa:ubuntu-audio-dev/ppa
        You are about to add the following PPA to your system:
        This PPA will be used to provide testing versions of packages for supported Ubuntu releases.
        More info: https://launchpad.net/~ubuntu-audio-dev/+archive/ppa
        Press [ENTER] to continue or ctrl-c to cancel adding it
        Executing: gpg --ignore-time-conflict --no-options --no-default-keyring --secret-keyring /tmp/tmp.7apgZoNrqK --trustdb-name /etc/apt/trustdb.gpg --keyring /etc/apt/trusted.gpg --primary-keyring /etc/apt/trusted.gpg --keyserver hkp://keyserver.ubuntu.com:80/ --recv 4E9F485BF943EF0EABA10B5BD225991A72B194E5
        gpg: requesting key 72B194E5 from hkp server keyserver.ubuntu.com
        gpg: key 72B194E5: public key "Launchpad Ubuntu Audio Dev team PPA" imported
        gpg: Total number processed: 1
        gpg: imported: 1 (RSA: 1)

    And I also ran an update (following the instructions in the fix above). Any ideas?
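
    A hedged next step (standard apt commands; nothing specific to this PPA is assumed): check whether the PPA actually publishes that package name for your kernel before assuming the install command is wrong.

        # Refresh the index after adding the PPA, then see what it provides:
        sudo apt-get update
        apt-cache search linux-alsa-driver-modules
        apt-cache policy linux-alsa-driver-modules-$(uname -r)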

    Read the article

  • News applications' internal workings [on hold]

    - by Vijay
    How do news applications work, other than RSS-feed-based applications? I know some of them take the RSS content from the source site. But sometimes I see those applications showing a title, description, date, image, video, etc., even though when I look at the original site's RSS, the image and video are not there. So how does one get that to show in their applications? Some applications even show feeds from magazine sites and newspaper sites. How do these applications work? I am creating an application which will link to different news sites' feeds, categorized (like top news, technology, games, articles etc.). On the front page it will show the website names; then, on selection of any news site, it will get the feed from that website and show it to the user. So I would like to know: should all fetching of data be done on user selection, or should data be prefetched? And how do I fetch the detailed information described above when the RSS data does not provide it? How should I go about it?
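
    As a sketch of the feed side (Python's feedparser library; the URL is made up): feeds often do carry images and video via media:content or enclosure elements even when the visible description has none, and apps that show still more usually fetch the article page itself and scrape metadata such as Open Graph tags.

        # Hedged sketch: pull title/description/date plus any media attachments.
        import feedparser

        feed = feedparser.parse("http://example.com/news/rss")   # hypothetical URL
        for entry in feed.entries:
            title = entry.get("title")
            summary = entry.get("summary")                       # the description
            published = entry.get("published")
            media = entry.get("media_content", [])               # media:content items
            enclosures = entry.get("enclosures", [])             # enclosure attachments
            print(title, published,
                  [m.get("url") for m in media],
                  [e.get("href") for e in enclosures])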

    Read the article

  • Not able to install LTT tool in 10.04

    - by Ashoka
    I have tried the commands below to install LTTng on VMware running Ubuntu 10.04, but got the following errors. Please help. From the link https://launchpad.net/~lttng/+archive/ppa :

        $ sudo apt-add-repository ppa:lttng/ppa
        $ sudo apt-get update
        $ sudo apt-get install lttng-tools lttng-modules-dkms babeltrace

        user@usr:~$ sudo apt-add-repository ppa:lttng/ppa
        Executing: gpg --ignore-time-conflict --no-options --no-default-keyring --secret-keyring /etc/apt/secring.gpg --trustdb-name /etc/apt/trustdb.gpg --keyring /etc/apt/trusted.gpg --primary-keyring /etc/apt/trusted.gpg --keyserver keyserver.ubuntu.com --recv C541B13BD43FA44A287E4161F4A7DFFC33739778
        gpg: requesting key 33739778 from hkp server keyserver.ubuntu.com
        gpg: unable to execute program `/usr/local/libexec/gnupg/gpgkeys_curl': No such file or directory
        gpg: no handler for keyserver scheme `hkp'
        gpg: keyserver receive failed: keyserver error
        user@usr:~$ sudo apt-get update
        .....
        user@usr:~$ sudo apt-get install lttng-tools lttng-modules-dkms babeltrace
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        E: Couldn't find package lttng-tools

    My system details:

        user@usr:~$ cat /etc/lsb-release
        DISTRIB_ID=Ubuntu
        DISTRIB_RELEASE=10.04
        DISTRIB_CODENAME=lucid
        DISTRIB_DESCRIPTION="Ubuntu 10.04.4 LTS"
        user@usr:~$ uname -a
        Linux usr 2.6.32-42-generic #95-Ubuntu SMP Wed Jul 25 15:57:54 UTC 2012 i686 GNU/Linux

    Regards, Ashoka
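
    Two hedged observations from the transcript (standard apt/gpg tooling; nothing LTTng-specific is assumed): the gpg failure means the PPA's signing key never arrived, so the archive is untrusted, and separately the PPA may simply not publish builds for lucid. Something like this would separate the two problems:

        # Fetch the PPA key explicitly, then refresh and see if the packages exist:
        sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 \
            --recv-keys C541B13BD43FA44A287E4161F4A7DFFC33739778
        sudo apt-get update
        apt-cache search lttng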

    Read the article

  • Why does Sharepoint 2010 Web Reference work, but Service Reference does not

    - by Darien Ford
    SharePoint is set up to use NTLM authentication. When I reference http://myserver/Sites/Ops/_vti_bin/Lists.asmx?WSDL as a Web Reference, I can call the methods and get valid responses. When I reference the same URL as a Service Reference, the server throws an exception when calling methods. My account is an admin on the SharePoint farm. This is the app.config for the service reference (mostly auto-generated):

        <?xml version="1.0" encoding="utf-8" ?>
        <configuration>
          <configSections>
          </configSections>
          <system.serviceModel>
            <bindings>
              <basicHttpBinding>
                <binding name="ListsSoap" closeTimeout="00:01:00" openTimeout="00:01:00"
                         receiveTimeout="00:10:00" sendTimeout="00:01:00" allowCookies="false"
                         bypassProxyOnLocal="false" hostNameComparisonMode="StrongWildcard"
                         maxBufferSize="65536" maxBufferPoolSize="524288" maxReceivedMessageSize="65536"
                         messageEncoding="Text" textEncoding="utf-8" transferMode="Buffered"
                         useDefaultWebProxy="true">
                  <readerQuotas maxDepth="32" maxStringContentLength="8192" maxArrayLength="16384"
                                maxBytesPerRead="4096" maxNameTableCharCount="16384" />
                  <security mode="TransportCredentialOnly">
                    <transport clientCredentialType="Ntlm" />
                  </security>
                </binding>
              </basicHttpBinding>
            </bindings>
            <client>
              <endpoint address="http://myserver/Sites/Ops/_vti_bin/Lists.asmx"
                        binding="basicHttpBinding" bindingConfiguration="ListsSoap"
                        contract="SharepointLists.ListsSoap" name="ListsSoap" />
            </client>
          </system.serviceModel>
        </configuration>

    Sadly, the only information the exception provides is this: "Exception of type 'Microsoft.SharePoint.SoapServer.SoapServerException' was thrown." No other details. The code that I'm using is:

        public ListClass()
        {
            _Client = new SharepointLists.ListsSoapClient();
        }

        public System.Xml.Linq.XElement GetTaskList()
        {
            return _Client.GetList("Tasks");
        }

    Any thoughts? I would like to use the Service Reference rather than the Web Reference.

    UPDATE: I tried Rob's suggestion and got this error:

        HTTP GET Error
        URI: http://myserver/Sites/Ops/_vti_bin/Lists.asmx
        The document at the url http://myserver/Sites/Ops/_vti_bin/Lists.asmx was not recognized as a known document type. The error message from each known type may help you fix the problem:
        - Report from 'http://myserver/Sites/Ops/_vti_bin/Lists.asmx' is 'The document format is not recognized (the content type is 'text/html; charset=utf-8').'.
        - Report from 'DISCO Document' is 'There was an error downloading 'http://myserver/_vti_bin/Lists.asmx?disco'.'.
          - The request failed with HTTP status 404: Not Found.
        - Report from 'WSDL Document' is 'The document format is not recognized (the content type is 'text/html; charset=utf-8').'.
        - Report from 'XML Schema' is 'The document format is not recognized (the content type is 'text/html; charset=utf-8').'.
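
    One hedged thing to try (a real WCF knob, but not confirmed as the fix for this case): with NTLM, the generated WCF proxy sometimes needs the Windows credential's impersonation level set explicitly, something the old-style Web Reference handled for you.

        // Sketch: allow the NTLM credential to be impersonated by the service.
        var client = new SharepointLists.ListsSoapClient("ListsSoap");
        client.ClientCredentials.Windows.AllowedImpersonationLevel =
            System.Security.Principal.TokenImpersonationLevel.Impersonation;
        System.Xml.Linq.XElement tasks = client.GetList("Tasks");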

    Read the article
