Search Results

Search found 13776 results on 552 pages for 'password reset'.


  • Tab Sweep: FacesMessage enhancements, Look up thread pool resources, JQuery/JSF integration, Galleria, ...

    - by arungupta
    Recent Tips and News on Java, Java EE 6, GlassFish & more : • Fixing remote GlassFish server errors on NetBeans (Igor Cardoso) • FacesMessage Enhancements (PrimeFaces) • How to create and look up thread pool resource in GlassFish (javahowto) • Jersey 1.12 is released (Jakub Podlesak) • VisualVM problem connecting to monitor Glassfish (Raymond Reid) • JSF 2.0 JQuery-JSF Integration (John Yeary) • JDBC-ODBC Bridge Example (John Yeary) • The Java EE 6 Example - Gracefully dealing with Errors in Galleria - Part 6 (Markus Eisele) • Logout functionality in Java web applications (JavaOnly) • LDAP PASSWORD POLICIES AND JAVAEE (Ricky's Hodgepodge) • Java User Groups Promote Java Education (java.net Editor's Daily Blog) • JavaEE Revisits Design Patterns: Aspects (Interceptor) (Developer Chronicles) • Java EE 6 Hand-on Workshop @ IIUI (Shahzad Badar) • javaee6-crud-example (Arjan Tims) • Sample CRUD application with JSF and RichFaces (Mark van der Tol) • 5 useful methods JSF developers should know (Java Code Geeks) Here are some tweets from this week ... Almost 9000 Parleys views at the #JavaEE6 #Devoxx talk I did with @BertErtman. Not even made available for free yet! #JavaEE6 is hot :-) Sent three proposals for Øredev, about #JavaEE6, #OSGi and a case study about Leren-op-Maat (OSGi in the cloud) together with @m4rr5 [blog] The Java EE 6 #Example - Gracefully dealing with #Errors in #Galleria - Part 6 http://t.co/Drg1EQvf #javaee6 Tomorrow, there is a session about Java EE6 #javaee6 at islamia university #bahawalpur under #pakijug.about 150 students going to attend it.

    Read the article

  • Installing SharePoint 2010 and PowerPivot for SharePoint on Windows 7

    - by smisner
    Many people like me want (or need) to do their business intelligence development work on a laptop. As someone who frequently speaks at various events or teaches classes on all subjects related to the Microsoft business intelligence stack, I need a way to run multiple server products on my laptop with reasonable performance. Once upon a time, that requirement meant only that I had to load the current version of SQL Server and the client tools of choice. In today's post, I'll review my latest experience with trying to make the newly released Microsoft BI products work with a Windows 7 operating system. The entrance of Microsoft Office SharePoint Server 2007 into the BI stack complicated matters and I started using Virtual Server to establish a "suitable" environment. As part of the team that delivered a lot of education as part of the Yukon pre-launch activities (that would be SQL Server 2005 for the uninitiated), I was working with four - yes, four - virtual servers. That was a pretty brutal workload for a 2GB laptop, which worked if I was very, very careful. It could also be a finicky and unreliable configuration as I learned to my dismay at one TechEd session several years ago when I had to reboot a very carefully cached set of servers just minutes before my session started. Although it worked, it came back to life very, very slowly much to the displeasure of the audience. They couldn't possibly have been less pleased than me. At that moment, I resolved to get the beefiest environment I could afford and consolidate to a single virtual server. Enter the 4GB 64-bit laptop to preserve my sanity and my livelihood. Likewise, for SQL Server 2008, I managed to keep everything within a single virtual server and I could function reasonably well with this approach. Now we have SQL Server 2008 R2 plus Office SharePoint Server 2010. That means a 64-bit operating system. Period. That means no more Virtual Server. That means I must use Hyper-V or another alternative. I've heard alternatives exist, but my few dabbles in this area did not yield positive results. It might have been just me having issues rather than any failure of those technologies to adequately support the requirements. My first run at working with the new BI stack configuration was to set up a 64-bit 4GB laptop with a dual-boot to run Windows Server 2008 R2 with Hyper-V. However, I was generally not happy with running Windows Server 2008 R2 on my laptop. For one, I couldn't put it into sleep mode, which is helpful if I want to prepare for a presentation beforehand and then walk to the podium without the need to hold my laptop in its open state along the way (my strategy at the TechEd session long, long ago). Secondly, it was finicky with projectors. I had issues from time to time and while I always eventually got it to work, I didn't appreciate those nerve-wracking moments wondering whether this would be the time that it wouldn't work. Somewhere along the way, I learned that it was possible to load SharePoint 2010 in a Windows 7 which piqued my interest. I had just acquired a new laptop running Windows 7 64-bit, and thought surely running the BI stack natively on my laptop must be better than running Hyper-V. (I have not tried booting to Hyper-V VHD yet, but that's on my list of things to try so the jury of one is still out on this approach.) Recently, I had to build up a server with the RTM versions of SQL Server 2008 R2 and Sharepoint Server 2010 and decided to follow suit on my Windows 7 Ultimate 64-bit laptop. 
The process is slightly different, but I'm happy to report that it IS possible, although I had some fits and starts along the way. DISCLAIMER: These products are NOT intended to be run in production mode on the Windows 7 operating system. The configuration described in this post is strictly for development or learning purposes and not supported by Microsoft. If you have trouble, you will NOT get help from them. I might be able to help, but I provide no guarantees of my ability or availability to help. I won't provide the step-by-step instructions in this post as there are other resources that provide these details, but I will provide an overview of my approach, point you to the relevant resources, describe some of the problems I encountered, and explain how I addressed those problems to achieve my desired goal. Because my goal was not simply to set up SharePoint Server 2010 on my laptop, but specifically PowerPivot for SharePoint, I started out by referring to the installation instructions at the PowerPivot-Info site, but mainly to confirm that I was performing steps in the proper sequence. I didn't perform the steps in Part 1 because those steps are applicable only to a server operating system, which I am not running on my laptop. For the same reason, the instructions in Part 2 won't work exactly as written. Instead, I followed the instructions on MSDN, Setting Up the Development Environment for SharePoint 2010 on Windows Vista, Windows 7, and Windows Server 2008. In general, I found the following differences in installation steps from the steps at PowerPivot-Info: You must copy the SharePoint installation media to the local drive so that you can edit the config.xml to allow installation on a Windows client. You also have to manually install the prerequisites. The instructions provide links to each item that you must manually install and provide a command-line instruction to execute which enables required Windows features. I will digress for a moment to save you some grief in the sequence of steps to perform. I discovered later that a missing step in the MSDN instructions is to install the November CTP Reporting Services add-in for SharePoint. When I went to test my SharePoint site (I believe I tested after I had a successful PowerPivot installation), I ran into the following error: Could not load file or assembly 'RSSharePointSoapProxy, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The system cannot find the file specified. I was rather surprised that Reporting Services was required. Then I found an article by Alan le Marquand, Working Together: SQL Server 2008 R2 Reporting Services Integration in SharePoint 2010, that instructed readers to install the November add-in. My first reaction was, "Really?!?" But I confirmed it in another TechNet article on hardware and software requirements for SharePoint Server 2010. It doesn't refer explicitly to the November CTP but following the link took me there. (Interestingly, I retested today and there's no longer any reference to the November CTP. Here's the link to download the latest and greatest Reporting Services Add-in for SharePoint Technologies 2010.) You don't need to download the add-in anymore if you're doing a regular server-based installation of SharePoint because it installs as part of the prerequisites automatically.
When it was time to start the installation of SharePoint, I deviated from the MSDN instructions and from the PowerPivot-Info instructions: On the Choose the installation you want page of the installation wizard, I chose Server Farm. On the Server Type page, I chose Complete. At the end of the installation, I did not run the configuration wizard. Returning to the PowerPivot-Info instructions, I tried to follow the instructions in Part 3, which describe installing SQL Server 2008 R2 with the PowerPivot option. These instructions tell you to choose the New Server option on the Setup Role page where you add PowerPivot for SharePoint. However, I ran into problems with this approach and got installation errors at the end. It wasn't until much later, as I was investigating an error, that I encountered Dave Wickert's post explaining that installing PowerPivot for SharePoint on Windows 7 is unsupported. Uh oh. But he did want to hear about it if anyone succeeded, so I decided to take the plunge. Perseverance paid off, and I can happily inform Dave that it does work so far. I haven't tested absolutely everything with PowerPivot for SharePoint but have successfully deployed a workbook and viewed the PowerPivot Management Dashboard. I have not yet tested the data refresh feature, but I have installed it. Continue reading to see how I accomplished my objective. I uninstalled SQL Server 2008 R2 and started again. I had different problems which I don't recollect now. However, I uninstalled again, approached installation from a different angle, and my next attempt succeeded. The downside of this approach is that you must do all of the things yourself that are done automatically when you install PowerPivot as a new server. Here are the steps that I followed: Install SQL Server 2008 R2 to get a database engine instance installed. Run the SharePoint configuration wizard to set up the SharePoint databases. In Central Administration, create a Web application using classic mode authentication as per a TechNet article on PowerPivot Authentication and Authorization. Then I followed the steps I found at How to: Install PowerPivot for SharePoint on an Existing SharePoint Server. Especially important to note - you must launch setup by using Run as administrator. I did not have to manually deploy the PowerPivot solution as the instructions specify, but it's good to know about this step because it tells you where to look in Central Administration to confirm a successful deployment. I did spot some incorrect steps in the instructions (at the time of this writing) in How To: Configure Stored Credentials for PowerPivot Data Refresh. Specifically, in the section entitled Step 1: Create a target application and set the credentials, both steps 10 and 12 are incorrect. They tell you to provide an actual Windows user name and password on the page where you are simply defining the prompts for your application in the Secure Store Service. To add the Windows user name and password that you want to associate with the application - after you have successfully created the target application - you select the target application and then click Set credentials in the ribbon. Lastly, I followed the instructions at How to: Install Office Data Connectivity Components on a PowerPivot server. However, I have yet to test this in my current environment. I did have several stops and starts throughout this process and edited those out to spare you from reading non-essential information.
I believe the explanation I have provided here accurately reflects the steps I followed to produce a working configuration. If you follow these steps and get a different result, please let me know so that together we can work through the issue and correct these instructions. I'm sure there are many other folks in the Microsoft BI community who will appreciate the ability to set up the BI stack in a Windows 7 environment for development or learning purposes.

    Read the article

  • Introducing the Oracle Parcel Service&ndash;Example/Reference Application

    - by Jeffrey West
    Over the last few weeks the product management team has been working on a webcast series that is airing in EMEA.  It is a 5-episode series where we talk about different features of WebLogic and show how to build applications that take advantage of these features.  Each session is focused at a different layer of the technology stack, and you can find the schedule below. The application we are building in this series is named the ‘Oracle Parcel Service’.  It is an example application and not a product of Oracle by any stretch of the imagination.  Over the next few weeks we will be finalizing the code and will be releasing it for you to check out.  For updates, request membership to the Oracle Parcel Service project on SampleCode.oracle.com: https://www.samplecode.oracle.com/sf/projects/oracle-parcel-svc/. Here are some of the key features that we are highlighting: JPA 2.0 (new in WebLogic 10.3.4) with EclipseLink Coherence TopLink Grid Level 2 cache for JPA JAX-RS (new in WebLogic 10.3.4) 1.0 for RESTful services Lightweight JQuery Web UI for consuming RESTful services JSF 2.0 (new in WebLogic 10.3.4) utilizing PrimeFaces EJB 3.0 Spring-WS Web Services JAX-WS Web Services Spring MDP’s for Event Driven Architectures Java MDB’s for Event Driven Architectures Partitioned Distributed Topics for Event Driven Architectures   Accessing the Code on SampleCode.Oracle.com You will need to log in using your Oracle.com username and password.  If you have not created an account, you will need to do so.  It’s a simple one-page form and we don’t bother you with too many emails.   Please join the project to be kept up to date on changes to the code and new projects.  Joining the project is not required, but very much appreciated. Once you have signed in you should see an icon for accessing the Source Code via Subversion.  You can also download a zip file containing the code.

    Read the article

  • Public EC Meeting scheduled for 20 November

    - by Heather VanCura
    The minutes and materials from the October 2012 JCP EC Teleconference are now available.  The next JCP EC Meeting, and the first EC Meeting under JCP 2.9, with the Merged EC, is scheduled for 20 November.  The second hour of this meeting will be open to the public at 3:00 PM PST. The agenda includes  JSR 355,  EC merge implementation report, JSR 358 (JCP.next.3) status report, JCP 2.8 status update and community audit program.  Details are below. We hope you will join us, but if you cannot attend, not to worry--the recording and materials will also be public on the JCP.org multimedia page. Meeting details Date & Time Tuesday November 20, 2012, 3:00 - 4:00 pm PST Location Teleconference Dial-in +1 (866) 682-4770 (US) Conference code: 627-9803 Security code: 52732 ("JCPEC" on your phone handset) For global access numbers see http://www.intercall.com/oracle/access_numbers.htm Or +1 (408) 774-4073 WebEx Browse for the meeting from https://jcp.webex.com No registration required (enter your name and email address) Password: JCPEC Agenda JSR 355 (the EC merge) implementation report JSR 358 (JCP.next.3) status report 2.8 status update and community audit program Discussion/Q&A Note The call will be recorded and the recording published on jcp.org, so those who are unable to join in real-time will still be able to participate.

    Read the article

  • Understanding HTTP Cookies in Indy 10 for Delphi XE2

    - by Jerry Dodge
    I have been working with Indy 10 HTTP Servers / Clients lately in Delphi XE2, and I need to make sure I'm understanding session management correctly. In the server, I have a "bucket" of sessions, which is a list of objects which each represent a unique session. I don't use username and password to authenticate users, but I rather use a unique API key which is issued to a client, and has an expiration. When a client wishes to connect to the server, it first logs in by calling the "login" command, which is a path like this: http://localhost:1234/login?APIKey=abcdefghij. The server checks this API Key against the database, and if it's valid, it creates a new session in the bucket, issues a new cookie (unique string), and sets the response cookies with Success=Y and Cookie=abcdefghij. This is where I have the question. Assuming the client end has its own method of cookie management, the client will receive this login response back from the server and automatically save the cookies as necessary. Any future request from the client to the server shall automatically send along these cookies, and the client side doesn't have to necessarily worry about setting these cookies when sending requests to the server. Right? PS - I'm asking this question here on programmers.stackexchange.com because I didn't see it fit to ask on stackoverflow.com. If anyone thinks this is appropriate enough for stackoverflow.com, please let me know.

    Read the article

  • New ACS Resell Portfolio for OPN Members

    - by rituchhibber
    Oracle Advanced Customer Support (ACS) Services is pleased to announce availability of the ACS Resell Portfolio to Oracle PartnerNetwork (OPN) members on June 28, 2012. The ACS Resell Portfolio is available to Gold level OPN members and above selling to end users with valid Oracle Premier Support/End User agreements, and in countries where ACS has a local in-country presence to support the partner business. ACS provides mission critical support services for complex IT environments to help maximize performance, achieve higher availability, and reduce risk. The ACS Resell Portfolio can be leveraged to reduce time to market and drive improved end user satisfaction. Including ACS services at point of license sale can maximize your success as an Oracle partner. On July 10, 2012, Oracle ACS is hosting a 60-minute resell portfolio training session. Topics include: ACS Resell Portfolio objectives, partner participation requirements, ACS portfolio services enabled for partner resell, ACS sales engagement and transaction processes, and contracting requirements. Attend the following session to hear how you can maximize your profit opportunities by including ACS services, which complement your solutions with integrated Oracle advanced support technologies. DIAL-IN INFORMATION Webconference July 10, 2012 4:00 PM CEST Webconference Session Number: 591 988 820 Session Password: ebh12345 International: 706.501.7506 US: 866.589.6202 Call ID: 95867658 Click here for a list of toll-free international numbers. Please contact [email protected] with any questions or visit the ACS website.

    Read the article

  • Host your own private git repository via SSH

    - by kerry
    If you are like me you have tons of projects you would like to keep private but track with git, but do not want to pay a git host for a private plan. One of the problems is that most hosts scale their plans by project instead of users. Luckily, it is easy to host your own git repositories on any el cheapo host that provides ssh access. In the interest of full disclosure, I learned this trick from this blog post. I decided to recreate it in case the source material vanishes for some reason. To set up your host, log in via ssh and run the following commands: mkdir -p ~/git/yourprojectname.git cd ~/git/yourprojectname.git git --bare init Then in your project directory (on your local machine): # setup your user info git config --global user.name "Firstname Lastname" git config --global user.email "[email protected]" # initialize the workspace git init git add . git commit -m "initial commit" git remote add origin ssh://[email protected]/~/git/yourprojectname.git git push origin master It's that easy! To keep from entering your password every time, add your public key to the server: Generate your key with 'ssh-keygen -t rsa' on your local machine. Then add the contents of the generated file to ~/.ssh/authorized_keys on your server.
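    A minimal sketch of that last step, assuming OpenSSH on both ends; user@yourhost is a placeholder for your own account on the hosting server, and ssh-copy-id may not be available on every system:

        # Generate a key pair on the local machine if one doesn't already exist
        ssh-keygen -t rsa

        # Easiest route: let ssh-copy-id append the public key for you
        ssh-copy-id user@yourhost

        # Manual alternative if ssh-copy-id isn't available
        cat ~/.ssh/id_rsa.pub | ssh user@yourhost \
          'mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys && chmod 700 ~/.ssh && chmod 600 ~/.ssh/authorized_keys'

        # This push should no longer prompt for a password
        git push origin master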

    Read the article

  • DENY select on sys.dm_db_index_physical_stats

    - by steveh99999
    I recently saw an interesting blog article by Paul Randal about the performance overhead of querying the sys.dm_db_index_physical_stats DMV. So I was thinking, would it be possible to let non-sysadmin users query DMVs on a SQL Server but stop them querying this I/O-intensive DMV? Yes it is, here's how... 1. Create a new login for test purposes, with permissions to access the AdventureWorks database only: CREATE LOGIN [test] WITH PASSWORD='xxxx', DEFAULT_DATABASE=[AdventureWorks] GO USE [AdventureWorks] GO CREATE USER [test] FOR LOGIN [test] WITH DEFAULT_SCHEMA=[dbo] GO 2. Log in as user test and issue the command SELECT * FROM sys.dm_db_index_physical_stats(DB_ID('AdventureWorks'),NULL,NULL,NULL,'DETAILED') -- this gets the error: Msg 297, Level 16, State 12, Line 1 The user does not have permission to perform this action. 3. As a sysadmin, issue the command: USE AdventureWorks GRANT VIEW DATABASE STATE TO [test] or GRANT VIEW SERVER STATE TO [test] if all databases can be queried via DMV. 4. Try again as user test to issue the command SELECT * FROM sys.dm_db_index_physical_stats(DB_ID('AdventureWorks'),NULL,NULL,NULL,'DETAILED') -- now it produces valid results from the DMV. 5. Now create the test user in the master database, public role only: USE master CREATE USER [test] FOR LOGIN [test] 6. Issue the command: USE master DENY SELECT ON sys.dm_db_index_physical_stats TO [test] 7. Now go back to AdventureWorks using the test login and try SELECT * FROM sys.dm_db_index_physical_stats(DB_ID('AdventureWorks'),NULL,NULL,NULL,'DETAILED') This now gets the error: Msg 229, Level 14, State 5, Line 1 The SELECT permission was denied on the object 'dm_db_index_physical_stats', database 'mssqlsystemresource', schema 'sys'. but the user is still able to query all other non-I/O-intensive DMVs. If the user attempts to view the index physical stats via a built-in Management Studio report (see a recent blog post by Pinal Dave), they get an error also.
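    The same GRANT/DENY sequence can also be scripted from a command prompt with sqlcmd; a sketch only, where YOURSERVER is a placeholder and -E assumes Windows authentication as a sysadmin:

        # Step 3: grant DMV access inside the user database
        sqlcmd -S YOURSERVER -E -d AdventureWorks -Q "GRANT VIEW DATABASE STATE TO [test];"

        # Steps 5 and 6: create the user in master and deny the I/O-intensive DMV
        sqlcmd -S YOURSERVER -E -d master -Q "CREATE USER [test] FOR LOGIN [test]; DENY SELECT ON sys.dm_db_index_physical_stats TO [test];"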

    Read the article

  • I am looking for a script where users can create groups cms/social interaction site

    - by Paul
    I am trying to find a Content Management/social interaction script that requires a person to be a part of a group. Specifically: My daughter is a cheerleader and there are a number of cheer groups she is involved in and also has friends in many others. A lot of them could use some kind of website where they can share information between their team members and coach. The coach would be the controller of the group and decide who can join etc. - a "Group Leader"? One can only join the group if invited or given a password or some such security. There would be multiple groups, or in this case multiple cheer squads, who were registered as groups and the cheerleaders a part of their group. The coach or group leader would have control of the group calendar, and they may have their own calendars and be messaging between them and/or other social interactions. In a perfect world they could modify their own pages individually. Communication could go globally or only to the group, and there would be a "friends or buddy" system. I think you get the idea. I really like OCportal and what it does and can do, but it does not have the group functionality I am looking for. Perhaps I am just going to need to see about getting a programmer to write an add-on for me if there is nothing like this out there. But if you know of any I would appreciate being pointed in that direction.

    Read the article

  • CodePlex Daily Summary for Monday, October 01, 2012

    CodePlex Daily Summary for Monday, October 01, 2012Popular ReleasesD3 Loot Tracker: 1.4.1: This version will automatically save a recording session on application exit if the user didn't stop the current session.CRM 2011 Visual Ribbon Editor: Visual Ribbon Editor (1.2.1001.1): Visual Ribbon Editor 1.2.1001.1 What's New: Ability to hide currently selected tab Password in the connection file is now being encrypted Esc and Enter keyboard keys are now supported in all dialogs Extended width of Prefix and Id fields in New Button and New Group dialogs Minor UI enchancements for Hide Button button Minor UI enchancements for View XML button and dialog File/assembly versioning for executable file Notes: Existing plain-text password will be automatically encrypt...Untangler: Untangler: First version of Untangler application available now.SubExtractor: Release 1029: Feature: Added option to make i and ¡ characters movie-specific for improved OCR on Spanish subs (Special Characters tab in Options) Feature: Allow switch to Word Spacing dialog directly from Spell Check dialog Fix: Added more default word spacings for accented characters Fix: Changed Word Spacing dialog to show all OCR'd characters in current sub Fix: Removed application focus grab during OCR Fix: Tightened HD subs fuzzy logic to reduce false matches in small characters Fix: Improved Arrow k...MCEBuddy 2.x: MCEBuddy 2.2.18: Reccomended download Changelog for 2.2.18 (32bit and 64bit) 1. Added support for checking if Showanalyzer has hung and cancelling it 2. New version of comskip, 0.81.48 3. Speeding up comskip 4. Fixed a build bug in 64bit 2.2.17 5. Added a new comkip.ini, better commercial detection for international channels and less aggressive. Old one has been retained as comskip_old.ini 6. Added support for Audio Offset on Conversion Task page in GUI (this overrides the profiles AudioDelay when specified)Mugen Injection: Mugen Injection 3.0: Added a generic version of the fluent syntax. Big changes in fluent-syntax. Added support to resolve a parameters: - Factory: AnyCustomFunc{T}, AnyCustomFunc{IEnumerable<IInjectionParameters>,T}, AnyCustomFunc{IDictionary<string, object>,T}, AnyCustomFunc{IEnumerable<IInjectionParameters>,IDictionary<string, object>,T} - Lazy: AnyCustomLazy<T> - Binding metadata: ISetting Change binding builders, now you can configure them through the string settings. Increase performance. F...Aggravation: Version 1.0: This version 1.0 release is pretty stable. You need the Silverlight 4 runtime, developer tools, and Experssion Blend 4 installed.Tube++: Tube++ 4.0.0.43: Fix parser for downloading and add new default slate gray styleReadable Passphrase Generator: KeePass Plugin 0.7.1: See the KeePass Plugin Step By Step Guide for instructions on how to install the plugin. 
Changes Built against KeePass 2.20Windows 8 Toolkit - Charts and More: Beta 1.0: The First Compiled Version of my LibraryCatchThatException: Release 1.0: This is the first relase for CatchThatException library at 29-9-2012CardPlay: a Solitaire Framework for .Net: 0.3.0: Added 5 games: Blockade, Chessboard, Colorado, Sly Fox, TwentyUltraFluid Modeling Suite: Beta 1: This release is experimental but contains a lot of features like: - xml serialization - multiple selection - clipboard copy/cut/paste on context menu There is the debug and the release (recommanded default) versions.PDF.NET: PDF.NET.Ver4.5-OpenSourceCode: PDF.NET Ver4.5 ????,????Web??????。 PDF.NET Ver4.5 Open Source Code,include a sample Web application project.Visual Studio Icon Patcher: Version 1.5.2: This version contains no new images from v1.5.1 Contains the following improvements: Better support for detecting the installed languages The extract & inject commands won’t run if Visual Studio is running You may now run in extract or inject mode The p/invoke code was cleaned up based on Code Analysis recommendations When a p/invoke method fails the Win32 error message is now displayed Error messages use red text Status messages use green textZXing.Net: ZXing.Net 0.9.0.0: On the way to a release 1.0 the API should be stable now with this version. sync with rev. 2393 of the java version improved api better Unity support Windows RT binaries Windows CE binaries new Windows Service demo new WPF demo WindowsCE Hotfix: Fixes an error with ISO8859-1 encoding and scannning of QR-Codes. The hotfix is only needed for the WindowsCE platform.C.B.R. : Comic Book Reader: CBR 0.7: Synthesis since 0.6 : ePUB : Complete refactoring Add a new dedicated feed viewer for opds stream PDF conversion : improved with image merge Make all backstage panel scrollable Integrate the new AvalonDock 2 library. Support multi-document. Library explorer and Table of content are now toolboxes Designer for dynamic books is now mvvm and much better New BrowserForControl Customized xps viewer to suppress toolbars and bind it to cbr commands Add quick start manual and button ...sosoft: sosoft source: sosoft source include alarm clockRawr: Rawr 5.0.0: This is the Downloadable WPF version of Rawr!For web-based version see http://elitistjerks.com/rawr.php You can find the version notes at: http://rawr.codeplex.com/wikipage?title=VersionNotes Rawr Addon (NOT UPDATED YET FOR MOP)We now have a Rawr Official Addon for in-game exporting and importing of character data hosted on Curse. The Addon does not perform calculations like Rawr, it simply shows your exported Rawr data in wow tooltips and lets you export your character to Rawr (including ba...Coevery - Free CRM: Coevery 1.0.0.26: The zh-CN issue has been solved. 
We also add a project management module.New ProjectsAmazonGlacierGUI - GUI Client for Amazon Glacier | WPF: a GUI client for amazon glacier , Written in .NET 4, C#,WPF,MVVM & RavenDB for storageAthene: UML editor.Basic Helpdesk Application: Develop a simple application to be used in a network environment to aid support staff while working remotely with users to resolve problems in their local PC.CAPTCHA Solver: CSolver, is a CAPTCHA Solver for Simple CAPTCHAs.ComicExt: ComicExt Works with CBZ files decompress them into a folder of the same name and convert them into zip files and vice versa from a single folder.Copy Your Table Storage: Tool to permit copy table storage dotCMS: it is a content management system base on BlogEngine.NET and Ext.NETihongma??: ?????????????????????????,????????????????、??????,?????????????????????????????,??????????????????????。 ??????????????,??????;???????????????,???????Liam F: Basic mathematical functions implemented using AVX/AVX2MVC Solution: MVC Solution, including many mvc best practicesopengltest: opengl testsample project: Testing...Sériethèque: A small tvshow organizer with viewed option, link to addict7ed.com, rename file etc ...SHOIC (SUPER HIGH ORBIT ION CANION): SHOIC (Super Hight Orbit Ion Canion) es una aplicación derivada de LOIC (Low Orbit Ion Canion)SignalRServerTry: signal r trySilverlight Datagrid Plus: Silverlight Datagrid plus adds basic features to the Silverlight toolkit datagrid such as grouping, filtering, new record and formatting of the datagrid.Sitecore Modules: This is a home for all the Sitecore Shared Source modules distributed by 5 Limes. Over time we expect this solution to contain several projects.Spiritual Chanting Utility: A quick media loop utilitySQL Web Studio Pad: A Simple Web Based SQL Management IDE for managing Database objectsStockmarket Simulation: Stockmarket Simulationtest_proj: ok testTISSEAN.NET: This project is based on: TISEAN 3.0.1 Nonlinear Time Series Analysis Rainer Hegger,Holger Kantz, Thomas Schreiber With major contributions by Eckehard OlbrichUntangler: Class hierarchy viewer for .Net assemblyValarmathi Hospital: Patient management.Windows 8 App For SharePoint Online: Windows 8 app to SharePoint 2013 OnlineWPF Anti-DataBinding Anti-DataTemplate Extensions: WPF Anti-DataBinding Anti-DataTemplate Extensions offers a pure C# code-behind (no XAML) approach to working with WPF controls, starting with the WPF ListBox.Yandex Positions Parser: ?????? ??????? ????? ? ?????? ??????? ?? ???????? ???????? ??????.YunCMS: A simple CMS for build protal site.

    Read the article

  • How to Run Pam Face Authentication

    - by Supriyo Banerjee
    I am using Ubuntu 11.10. I went to the following URL to download the software 'Pam Face Authentication': http://ppa.launchpad.net/antonio.chiurazzi/ppa/ubuntu/pool/main/p/pam-face-authentication/ and downloaded the version for Natty Narwhal. I installed the software using the following commands: sudo apt-get install build-essential cmake qt4-qmake libx11-dev libcv-dev libcvaux-dev libhighgui2.1 libhighgui-dev libqt4-dev libpam0g-dev checkinstall cd /tmp && wget http://pam-face-authentication.googlecode.com/files/pam-face-authentication-0.3.tar.gz sudo add-apt-repository ppa:antonio.chiurazzi sudo apt-get update sudo apt-get install pam-face-authentication cat << EOF | sudo tee /usr/share/pam-configs/face_authentication /dev/null Name: face_authentication profile Default: yes Priority: 900 Auth-Type: Primary Auth: [success=end default=ignore] pam_face_authentication.so enableX EOF sudo pam-auth-update --package face_authentication The software installed and I can run the qt-facetrainer. But the problem is that when I restarted my system, I saw that the default login screen appears where I should put in my password to log in. The webcam is not starting at all, and I cannot log in with my face, which means I think that the pam face authentication programme is not starting at all. Please let me know how I can log in with my face using the pam face authentication programme.
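    A few generic checks worth running from a terminal if the webcam never starts at the login screen; this is a troubleshooting sketch, not part of the original instructions:

        # Confirm the PAM profile was actually enabled
        grep -r pam_face_authentication /etc/pam.d/

        # Confirm the kernel sees a webcam device
        ls -l /dev/video*

        # Re-run the PAM configuration step if nothing shows up under /etc/pam.d
        sudo pam-auth-update --package face_authentication

        # Watch the auth log during a login attempt to see whether the module is invoked at all
        tail -f /var/log/auth.log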

    Read the article

  • How to properly deny Railo directory access through Apache

    - by Sn3akyP3t3
    I've been battle tested on this and failed to achieve my goal which is to deny all access to all directories except the Public directory and only allow access to all all other directories with specific IP addresses. To get Railo+Apache+Tomcat installed I pretty much followed this script: https://github.com/talltroym/Railo-Ubuntu-Installer-Script then verified settings with this tutorial: http://blog.nictunney.com/2012/03/railo-tomcat-and-apache-on-amazon-ec2.html From the installation script these mods are enabled: sudo a2enmod ssl sudo a2enmod proxy sudo a2enmod proxy_http sudo a2enmod rewrite sudo a2ensite default-ssl Outside of the script I copied the sites-available to sites-enabled then reloaded Apache. I have a directory created for Railo cmfl located at /var/www/Railo/ Navigating the browser to http ://Server_IP_Address/Railo forces ssl and relocates to https ://Server_IP_Address/Railo which shows off index.cfm. Not providing index.cfm and omitting https indicates that the DirectoryIndex directive and RewriteCond of Apache appears to be working for the sites-enabled VirtualHost. The problem I'm encountering is that I cannot seem to deny access to all directories except Public. My directory structure is rather simple and looks like this: Railo error Public NotPublic Sandbox These are my sites-enabled configurations: <VirtualHost *:80> ServerAdmin webmaster@localhost DocumentRoot /var/www #Default Deny All to prevent walking backwards in file system Alias /Railo/ "/var/www/Railo/" <Directory ~ ".*/Railo/(?!Public).*"> Order Deny,Allow Deny from All </Directory> ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/ <Directory "/usr/lib/cgi-bin"> AllowOverride None Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch Order allow,deny Allow from all </Directory> ErrorLog ${APACHE_LOG_DIR}/error.log # Possible values include: debug, info, notice, warn, error, crit, # alert, emerg. LogLevel warn CustomLog ${APACHE_LOG_DIR}/access.log combined Alias /doc/ "/usr/share/doc/" <Directory "/usr/share/doc/"> Options Indexes MultiViews FollowSymLinks AllowOverride None Order deny,allow Deny from all Allow from 127.0.0.0/255.0.0.0 ::1/128 </Directory> DirectoryIndex index.cfm index.cfml default.cfm default.cfml index.htm index.html index.cfc RewriteEngine on RewriteCond %{SERVER_PORT} !^443$ RewriteRule ^.*$ https://%{SERVER_NAME}%{REQUEST_URI} [L,R] </VirtualHost> and <IfModule mod_ssl.c> <VirtualHost _default_:443> ServerAdmin webmaster@localhost DocumentRoot /var/www Alias /Railo/ "/var/www/Railo/" <Directory ~ "/var/www/Railo/(?!Public).*"> Order Deny,Allow Deny from All </Directory> ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/ <Directory "/usr/lib/cgi-bin"> AllowOverride None Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch Order allow,deny Allow from all </Directory> ErrorLog ${APACHE_LOG_DIR}/error.log # Possible values include: debug, info, notice, warn, error, crit, # alert, emerg. LogLevel warn CustomLog ${APACHE_LOG_DIR}/ssl_access.log combined Alias /doc/ "/usr/share/doc/" <Directory "/usr/share/doc/"> Options Indexes MultiViews FollowSymLinks AllowOverride None Order deny,allow Deny from all Allow from 127.0.0.0/255.0.0.0 ::1/128 </Directory> # SSL Engine Switch: # Enable/Disable SSL for this virtual host. SSLEngine on # A self-signed (snakeoil) certificate can be created by installing # the ssl-cert package. See # /usr/share/doc/apache2.2-common/README.Debian.gz for more info. # If both key and certificate are stored in the same file, only the # SSLCertificateFile directive is needed. 
SSLCertificateFile /etc/ssl/certs/ssl-cert-snakeoil.pem SSLCertificateKeyFile /etc/ssl/private/ssl-cert-snakeoil.key # Server Certificate Chain: # Point SSLCertificateChainFile at a file containing the # concatenation of PEM encoded CA certificates which form the # certificate chain for the server certificate. Alternatively # the referenced file can be the same as SSLCertificateFile # when the CA certificates are directly appended to the server # certificate for convinience. #SSLCertificateChainFile /etc/apache2/ssl.crt/server-ca.crt # Certificate Authority (CA): # Set the CA certificate verification path where to find CA # certificates for client authentication or alternatively one # huge file containing all of them (file must be PEM encoded) # Note: Inside SSLCACertificatePath you need hash symlinks # to point to the certificate files. Use the provided # Makefile to update the hash symlinks after changes. #SSLCACertificatePath /etc/ssl/certs/ #SSLCACertificateFile /etc/apache2/ssl.crt/ca-bundle.crt # Certificate Revocation Lists (CRL): # Set the CA revocation path where to find CA CRLs for client # authentication or alternatively one huge file containing all # of them (file must be PEM encoded) # Note: Inside SSLCARevocationPath you need hash symlinks # to point to the certificate files. Use the provided # Makefile to update the hash symlinks after changes. #SSLCARevocationPath /etc/apache2/ssl.crl/ #SSLCARevocationFile /etc/apache2/ssl.crl/ca-bundle.crl # Client Authentication (Type): # Client certificate verification type and depth. Types are # none, optional, require and optional_no_ca. Depth is a # number which specifies how deeply to verify the certificate # issuer chain before deciding the certificate is not valid. #SSLVerifyClient require #SSLVerifyDepth 10 # Access Control: # With SSLRequire you can do per-directory access control based # on arbitrary complex boolean expressions containing server # variable checks and other lookup directives. The syntax is a # mixture between C and Perl. See the mod_ssl documentation # for more details. #<Location /> #SSLRequire ( %{SSL_CIPHER} !~ m/^(EXP|NULL)/ \ # and %{SSL_CLIENT_S_DN_O} eq "Snake Oil, Ltd." \ # and %{SSL_CLIENT_S_DN_OU} in {"Staff", "CA", "Dev"} \ # and %{TIME_WDAY} >= 1 and %{TIME_WDAY} <= 5 \ # and %{TIME_HOUR} >= 8 and %{TIME_HOUR} <= 20 ) \ # or %{REMOTE_ADDR} =~ m/^192\.76\.162\.[0-9]+$/ #</Location> # SSL Engine Options: # Set various options for the SSL engine. # o FakeBasicAuth: # Translate the client X.509 into a Basic Authorisation. This means that # the standard Auth/DBMAuth methods can be used for access control. The # user name is the `one line' version of the client's X.509 certificate. # Note that no password is obtained from the user. Every entry in the user # file needs this password: `xxj31ZMTZzkVA'. # o ExportCertData: # This exports two additional environment variables: SSL_CLIENT_CERT and # SSL_SERVER_CERT. These contain the PEM-encoded certificates of the # server (always existing) and the client (only existing when client # authentication is used). This can be used to import the certificates # into CGI scripts. # o StdEnvVars: # This exports the standard SSL/TLS related `SSL_*' environment variables. # Per default this exportation is switched off for performance reasons, # because the extraction step is an expensive operation and is usually # useless for serving static content. So one usually enables the # exportation for CGI and SSI requests only. 
# o StrictRequire: # This denies access when "SSLRequireSSL" or "SSLRequire" applied even # under a "Satisfy any" situation, i.e. when it applies access is denied # and no other module can change it. # o OptRenegotiate: # This enables optimized SSL connection renegotiation handling when SSL # directives are used in per-directory context. #SSLOptions +FakeBasicAuth +ExportCertData +StrictRequire <FilesMatch "\.(cgi|shtml|phtml|php)$"> SSLOptions +StdEnvVars </FilesMatch> <Directory /usr/lib/cgi-bin> SSLOptions +StdEnvVars </Directory> # SSL Protocol Adjustments: # The safe and default but still SSL/TLS standard compliant shutdown # approach is that mod_ssl sends the close notify alert but doesn't wait for # the close notify alert from client. When you need a different shutdown # approach you can use one of the following variables: # o ssl-unclean-shutdown: # This forces an unclean shutdown when the connection is closed, i.e. no # SSL close notify alert is send or allowed to received. This violates # the SSL/TLS standard but is needed for some brain-dead browsers. Use # this when you receive I/O errors because of the standard approach where # mod_ssl sends the close notify alert. # o ssl-accurate-shutdown: # This forces an accurate shutdown when the connection is closed, i.e. a # SSL close notify alert is send and mod_ssl waits for the close notify # alert of the client. This is 100% SSL/TLS standard compliant, but in # practice often causes hanging connections with brain-dead browsers. Use # this only for browsers where you know that their SSL implementation # works correctly. # Notice: Most problems of broken clients are also related to the HTTP # keep-alive facility, so you usually additionally want to disable # keep-alive for those clients, too. Use variable "nokeepalive" for this. # Similarly, one has to force some clients to use HTTP/1.0 to workaround # their broken HTTP/1.1 implementation. Use variables "downgrade-1.0" and # "force-response-1.0" for this. BrowserMatch "MSIE [2-6]" \ nokeepalive ssl-unclean-shutdown \ downgrade-1.0 force-response-1.0 # MSIE 7 and newer should be able to use keepalive BrowserMatch "MSIE [17-9]" ssl-unclean-shutdown DirectoryIndex index.cfm index.cfml default.cfm default.cfml index.htm index.html #Proxy .cfm and cfc requests to Railo ProxyPassMatch ^/(.+.cf[cm])(/.*)?$ http://127.0.0.1:8888/$1 ProxyPassReverse / http://127.0.0.1:8888/ #Deny access to admin except for local clients <Location /railo-context/admin/> Order deny,allow Deny from all # Allow from <Omitted> # Allow from <Omitted> Allow from 127.0.0.1 </Location> </VirtualHost> </IfModule> The apache2.conf includes the following: # Include the virtual host configurations: Include sites-enabled/ <IfModule !mod_jk.c> LoadModule jk_module /usr/lib/apache2/modules/mod_jk.so </IfModule> <IfModule mod_jk.c> JkMount /*.cfm ajp13 JkMount /*.cfc ajp13 JkMount /*.do ajp13 JkMount /*.jsp ajp13 JkMount /*.cfchart ajp13 JkMount /*.cfm/* ajp13 JkMount /*.cfml/* ajp13 # Flex Gateway Mappings # JkMount /flex2gateway/* ajp13 # JkMount /flashservices/gateway/* ajp13 # JkMount /messagebroker/* ajp13 JkMountCopy all JkLogFile /var/log/apache2/mod_jk.log </IfModule> I believe I understand most of this except the jk_module inclusion which I've noticed has an error that shows up in the logs that I can't sort out: [warn] No JkShmFile defined in httpd.conf. 
Using default /etc/apache2/logs/jk-runtime-status I've checked my Regular expression against the paths of the directories with RegexBuddy just to be sure that I wasn't correct. The problem doesn't appear to be Regex related although I may have something incorrect in the Directory directive. The Location directive seems to be working correctly for blocking out Railo admin site access.
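    A quick way to verify which access rules Apache is actually applying; a sketch only, where Server_IP_Address is the same placeholder used above and curl -k simply accepts the self-signed (snakeoil) certificate:

        # Check that the virtual hosts and Directory blocks parse cleanly
        sudo apachectl configtest
        sudo apachectl -S

        # Public content should come back 200
        curl -k -s -o /dev/null -w "%{http_code}\n" https://Server_IP_Address/Railo/Public/index.cfm

        # Everything else should come back 403, but only if the Directory regex is
        # applied before the proxy/JkMount handlers take over
        curl -k -s -o /dev/null -w "%{http_code}\n" https://Server_IP_Address/Railo/NotPublic/index.cfm
        curl -k -s -o /dev/null -w "%{http_code}\n" https://Server_IP_Address/Railo/Sandbox/index.cfm

        # Tail the error log while testing to see which rule produced a denial
        sudo tail -f /var/log/apache2/error.log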

    Read the article

  • LastPass Now Monitors Your Accounts for Security Breaches

    - by Jason Fitzpatrick
    Staying on top of security breaches and how they may or may not affect you is time consuming. Sentry, a new and free addition to the LastPass password management tool, automates the process and notifies you of breaches. In response to all the recent and unfortunate high-profile security breaches LastPass has rolled out Sentry–a tool that monitors breach lists to notify you if your email appears in a list of breached accounts. The lists are supplied by PwnedList, a massive database of security breach data, and securely indexed against your accounts within the LastPass system. If there is a security breach and your email is on the list, you’ll receive an automated email notice indicating which website was compromised and that your email address was one of the positive matches from the breach list. LastPass Sentry is a free feature and, as of yesterday, is automatically activated on all Free, Premium, and Enterprise level accounts. Hit up the link below to read the official announcement. Introducing LastPass Sentry [The LastPass Blog]

    Read the article

  • Black screen on login, can get thru decrypt disk and access command line but no GUI

    - by t3lf3c
    Running 12.04 64 bit fresh alternative install, with disk crypto, on a new Lenovo laptop. The install didn't connect and install modules, even though I had the network cable plugged in and don't have any whacky proxy settings. I had to manually install ubuntu-desktop and define sources after initial installation, so this seemed a bit weird (ISO matched MD5 sum though). I unplug the network cable, otherwise I get a black screen that I can do nothing with. So I turn the laptop on, I have disk encryption, I type in the password at the Ubuntu decryption GUI, then get the "set up successfully" message, "Waiting for network configuration ..." then "Waiting for up to 60 more seconds for network configuration". At this stage (a) if I wait for it then I get a black screen that I can do nothing with; (b) if I interrupt the process by pressing escape, then I break through to the command line. From the command line, I can go ahead and log in, then plug my network cable in to do apt-get commands. As a precaution I do some housekeeping which takes a few mins to run: sudo apt-get update sudo apt-get upgrade Running startx to get to the GUI gives: Fatal server error: no screens found The .Xauthority file is being created in my home directory but it's empty. I review my order and note the system graphics: Intel HD Graphics (WWAN or mSATA capable). So it's weird that I can't get to GNOME. It looks like drivers aren't working. Is there a way of getting Intel drivers from the command line? Or do you have any other suggestions on what to try next?
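    A generic sketch of things to try from that command line, assuming the stock Ubuntu repositories are reachable once the cable is plugged in; the package and service names below are the usual 12.04 ones:

        # Make sure the Intel X driver and the full desktop stack are installed
        sudo apt-get install xserver-xorg-video-intel ubuntu-desktop

        # "no screens found" usually has a more specific (EE) error a few lines earlier
        grep -E "\(EE\)|\(WW\)" /var/log/Xorg.0.log | less

        # The "Waiting for network configuration" delay often points at an interface
        # listed in /etc/network/interfaces that never comes up; commenting it out
        # and letting Network Manager handle it can help
        sudo nano /etc/network/interfaces

        # Restart the display manager rather than calling startx directly
        sudo service lightdm restart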

    Read the article

  • Securing ClickOnce hosted with Amazon S3 Storage

    - by saifkhan
    Well, since my post on hosting ClickOnce with Amazon S3 Storage, I've received quite a few emails asking how to secure the deployment. At the time of this post, I regret to say that there is no way to secure your ClickOnce deployment hosted with Amazon S3. The S3 storage is secured by ACL, meaning that a username and password will have to be provided before access. Amazon CloudFront, which sits on top of S3, allows you to apply security settings to your CloudFront distribution by applying encryption to the URL, or by restricting by IP. The problem with CloudFront is that the encryption of the URL is mandatory. ClickOnce does not provide a way to pass the "Amazon Public Key" to the CloudFront URL (you probably can if you start editing the XML and HTML files ClickOnce generates, but that defeats the purpose of ClickOnce altogether). What would be nice is if Amazon would allow users to restrict by IP addresses or IP blocks. I sent them an email and received a response that this is something they are looking into... I won't hold my breath though. Alternative: I suggest you look at Rack Space Cloud hosting http://www.rackspacecloud.com - they have very competitive pricing and recently started hosting Windows Virtual Servers. What you can do is rent a virtual server and set up IIS to host your ClickOnce applications. You can then use IIS security settings to restrict which IPs/blocks can access your ClickOnce payloads. Note: You don't really need Windows Server to host ClickOnce. Any web server can do. If you are familiar with Linux you can run that VM with Rackspace for half the price of Windows. I hope you found this information helpful.

    Read the article

  • ANNOUNCEMENT: Oracle VM 3 Templates Available for Oracle Secure Global Desktop 4.62

    - by Mohan Prabhala
    Today, we are proud to announce the general availability of Oracle VM 3 templates for Oracle Secure Global Desktop version 4.62.  With Oracle VM 3 templates, anyone using Oracle VM 3 need not download, install and configure the Operating System and product(s) individually. In this case, the supported operating system (Oracle Linux 5.7) and Oracle Secure Global Dekstop 4.62 product is packaged together into a template that one can easily import and clone as a VM into Oracle VM 3. This results in a nearly instant deployment and configuration of Oracle Secure Global Desktop within Oracle VM 3.  This means drastically reducing the evaluation and deployment time for Oracle Secure Global Desktop when leveraging Oracle VM 3. Feel free to give it a try! Login into the Oracle VM section at Oracle Software Delivery Cloud  (click on 'Cloud Portal (Main)' at the top-right) and: Under Oracle VM templates - x86 64-bit, look for Oracle VM 3 Template (OVF) for Oracle Secure Global Desktop Media Pack for x86_64 (64 bit) Oracle Secure Global Desktop 4.62 template for x86_64 (64 bit) with Oracle Linux 5.7 Under Oracle VM templates – x86 32 bit, look for Oracle VM 3 Template (OVF) for Oracle Secure Global Desktop Media Pack for x86 (32 bit) Oracle Secure Global Desktop 4.62 template for x86 (32 bit) with Oracle Linux 5.7 Download any of the above templates. Once you are done, you must First import the assembly (ova) file that you downloaded from Oracle Software Delivery Cloud Next, create a virtual machine template from the assembly And finally create a virtual machine from the template. Once the virtual machine is created and starts up, be sure to configure the networking parameters (hostname, IP address, netmask, gateway etc), and optional user parameters correctly. You must also enter a root password during first boot. And that's it - the Oracle Secure Global Desktop install script will pick up the networking parameters, prompt for confirmation and complete a default installation. Once the installation is complete, you may want to refer to the Oracle Secure Global Desktop Administration Guide to learn more about Oracle Secure Global Desktop and its capabilities.

    Read the article

  • Friday Fun: Wake Up the Box

    - by Mysticgeek
    Another Friday and it's time to waste the rest of your Friday playing a fun flash game online. Today we take a look at a relaxing physics-based puzzle game called Wake Up the Box. Wake Up the Box The goal of this game is to wake up the box character by attaching parts of existing wood objects in each stage. You can start a new game or continue your progress from where you left off. At the beginning you get a tutorial showing what you need to do to wake the box. You get wood parts and can attach them to other wood pieces but not metal or brick. After successfully waking up Mr. Box, you can go to the next level or restart a level at any time if you're having problems figuring out the puzzle. Each level gets more difficult and the puzzles are more challenging. Wake Up the Box is a relaxing and challenging game that will allow you to have fun, not working on TPS reports, until the whistle blows. Play Wake Up the Box at FreeWebArcade

    Read the article

  • How do I get a Canon MG, MP and MX series USB printer working?

    - by Old-linux-fan
    Printer MP6150 driver installed itself upon plugging in the printer. Printer is recognized (lsusb shows it) but does not mount. If the printer is recognized, the driver must be working (or?), but something is blocking the system from mounting the printer. Tried the usual things: power of printer, restart Ubuntu etc. Listed below result of lsusb and fstab: hans@kontor-linux:~$ lsusb Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub Bus 003 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub Bus 004 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub Bus 005 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub Bus 006 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub Bus 007 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub Bus 001 Device 004: ID 04a9:174a Canon, Inc. Bus 002 Device 002: ID 1058:1001 Western Digital Technologies, Inc. External Hard Disk [Elements] Bus 004 Device 002: ID 046d:c517 Logitech, Inc. LX710 Cordless Desktop Laser hans@kontor-linux:~$ sudo cat /etc/fstab [sudo] password for hans: # /etc/fstab: static file system information. # # Use 'blkid' to print the universally unique identifier for a # device; this may be used with UUID= as a more robust way to name devices # that works even if disks are added and removed. See fstab(5). # # <file system> <mount point> <type> <options> <dump> <pass> proc /proc proc nodev,noexec,nosuid 0 0 # / was on /dev/sda6 during installation UUID=eaf3b38d-1c81-4de9-98d4-3834d674ff6e / ext4 errors=remount-ro 0 1 # swap was on /dev/sda5 during installation UUID=93a667d3-6132-45b5-ad51-1f8a46c5b437 none swap sw 0 0 Here is what I have tried: Tried the HK link again, but no luck so far. However, I connected the printer wirelessly to router on other xp box. Installing the driver from ppa:michael-gruz/canon doesn't work, but the driver is installed
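    Some generic CUPS checks for a USB printer that shows up in lsusb but never appears as a print queue; a sketch only, and the exact Canon driver package name depends on the printer model:

        # See whether CUPS itself detects the device
        lpinfo -v

        # List any print queues that were created automatically
        lpstat -p -d

        # Restart CUPS and watch its log while re-plugging the printer
        sudo service cups restart
        sudo tail -f /var/log/cups/error_log

        # If no queue appears, search for a Canon IJ driver package from the PPA
        # mentioned above (the name pattern here is only an example)
        apt-cache search cnijfilter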

    Read the article

  • sudo apt-get update does not work for 12.10

    - by Brian Hawi
    hey i recently installed ubuntu 12.10 but the software center does not work i tried the sudo apt-get update because that worked when i was using ubuntu 11.04.... these are the errors hawi@hawi-HP-G62-Notebook-PC:~$ sudo apt-get update [sudo] password for hawi: Err http:ke.archive.ubuntu.com quantal InRelease Err http:ke.archive.ubuntu.com quantal-updates InRelease Err http:ke.archive.ubuntu.com quantal-backports InRelease Err http:ke.archive.ubuntu.com quantal Release.gpg Unable to connect to ke.archive.ubuntu.com:http: Err http:ke.archive.ubuntu.com quantal-updates Release.gpg Unable to connect to ke.archive.ubuntu.com:http: Err http:ke.archive.ubuntu.com quantal-backports Release.gpg Unable to connect to ke.archive.ubuntu.com:http: Err http:security.ubuntu.com quantal-security InRelease Err http:security.ubuntu.com quantal-security Release.gpg Unable to connect to security.ubuntu.com:http: [IP: 91.189.92.190 80] Err http:extras.ubuntu.com quantal InRelease Err http:extras.ubuntu.com quantal Release.gpg Unable to connect to extras.ubuntu.com:http: Reading package lists... Done W: Failed to fetch http:ke.archive.ubuntu.com/ubuntu/dists/quantal/InRelease W: Failed to fetch http:ke.archive.ubuntu.com/ubuntu/dists/quantal-updates/InRelease W: Failed to fetch http:ke.archive.ubuntu.com/ubuntu/dists/quantal-backports/InRelease W: Failed to fetch http:security.ubuntu.com/ubuntu/dists/quantal-security/InRelease W: Failed to fetch http:extras.ubuntu.com/ubuntu/dists/quantal/InRelease W: Failed to fetch http:ke.archive.ubuntu.com/ubuntu/dists/quantal/Release.gpg Unable to connect to ke.archive.ubuntu.com:http: W: Failed to fetch http:ke.archive.ubuntu.com/ubuntu/dists/quantal-updates/Release.gpg Unable to connect to ke.archive.ubuntu.com:http: W: Failed to fetch http:ke.archive.ubuntu.com/ubuntu/dists/quantal-backports/Release.gpg Unable to connect to ke.archive.ubuntu.com:http: W: Failed to fetch http:security.ubuntu.com/ubuntu/dists/quantal-security/Release.gpg Unable to connect to security.ubuntu.com:http: [IP: 91.189.92.190 80] W: Failed to fetch http:extras.ubuntu.com/ubuntu/dists/quantal/Release.gpg Unable to connect to extras.ubuntu.com:http: W: Some index files failed to download. They have been ignored, or old ones used instead. (note i have removed the // after http because the site does not allow me to post more than two links) what could be the issue?
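    All of the errors above are "Unable to connect" failures, which usually points at the country mirror (ke.archive.ubuntu.com) or a proxy rather than at apt itself. A sketch of two quick checks and a mirror switch, backing up sources.list first:

        # Can the machine reach the mirrors at all?
        ping -c 3 ke.archive.ubuntu.com
        ping -c 3 archive.ubuntu.com

        # If only the country mirror fails, point sources.list at the main archive
        sudo cp /etc/apt/sources.list /etc/apt/sources.list.bak
        sudo sed -i 's/ke\.archive\.ubuntu\.com/archive.ubuntu.com/g' /etc/apt/sources.list

        # Try again
        sudo apt-get update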

    Read the article

  • JDBC Connection Pools in Glassfish

    - by Dana Singleterry
    I've been attempting to configure GlassFish 3.1.2.2 for ADF 11g, and the need arose to create a JDBC connection pool to my Oracle XE 11g database. While this is really quite trivial, there were no samples of how to do it, and the documentation, while good, rarely provides concrete examples. After fumbling around for a few minutes searching for an example I gave up and figured it out on my own. Here are the steps for any of you who may be in need. This can be done either via the GlassFish command-line tool asadmin or through the admin console; I'm doing it through the admin console (an asadmin sketch follows below).
    1. Start GlassFish and connect to the admin console with the credentials you defined at installation: http://localhost:4848
    2. Navigate to Resources | JDBC | JDBC Connection Pools and select New. Be sure to enter Resource Type and Datasource Classname under the General Settings tab. You can go with the defaults for Pool Settings, etc.
    3. Go to the Additional Properties tab and create username, password, and url properties with the respective values.
    4. Navigate to Resources | JDBC | JDBC Resources and select New. Be sure to enter the JNDI Name and select the Pool Name of the JDBC connection pool you created previously.
    5. Navigate to Configurations | server-config | JVM Settings and select the JVM Options tab. Add -Doracle.jdbc.J2EE13Compliant=true, which is used to make sure the driver behaves in a JEE-compliant manner.
    6. To integrate the JDBC driver into a GlassFish Server domain, copy the JAR files into the domain-dir/lib directory, then restart the server. The JAR file for the Oracle 11 database driver is ojdbc6dms.jar.
    An upcoming entry will demonstrate configuring GlassFish for Oracle ADF applications.
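    For reference, here is a rough asadmin equivalent of the console steps above. It is a sketch, not taken from the original post: the pool name (OraclePool), JNDI name (jdbc/OracleDS), and the hr/localhost/XE connection details are placeholders to adapt to your own environment.
        # Create the connection pool (colons inside the url property must be escaped for asadmin)
        asadmin create-jdbc-connection-pool \
          --restype javax.sql.DataSource \
          --datasourceclassname oracle.jdbc.pool.OracleDataSource \
          --property user=hr:password=hr:url="jdbc\\:oracle\\:thin\\:@localhost\\:1521\\:XE" \
          OraclePool

        # Expose the pool under a JNDI name and verify it
        asadmin create-jdbc-resource --connectionpoolid OraclePool jdbc/OracleDS
        asadmin ping-connection-pool OraclePool
    The ping-connection-pool step is a quick way to confirm the datasource class, URL, and credentials before wiring the resource into an application.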

    Read the article

  • Spec Lead call tomorrow - EG Nominations

    - by heathervc
    Tomorrow, Thursday, 21 June, the PMO will host a call for JSR Spec Leads on the topic of Expert Group (EG) nominations (details below). The materials and recording of this call will be posted here following the meeting.
    Meeting information
    Topic: SL call on EG nominations
    Date: Thursday, June 21, 2012
    Time: 8:30 am, Pacific Daylight Time (San Francisco, GMT-07:00)
    Meeting Number: 807 980 273
    Meeting Password: nominations
    To start or join the online meeting, go to https://jcp.webex.com/jcp/j.php?ED=179196322&UID=491098062&PW=NOWVlZTFiMmRj&RT=MiM0
    Audio conference information: +1 866 682 4770, conference code: 4467704, passcode: 1234. For global access numbers, see http://www.intercall.com/oracle/access_numbers.htm or call +1 408 774 4073
    For assistance: 1. Go to https://jcp.webex.com/jcp/mc 2. On the left navigation bar, click "Support".
    To add this meeting to your calendar program (for example Microsoft Outlook), click this link: https://jcp.webex.com/jcp/j.php?ED=179196322&UID=491098062&ICS=MS&LD=1&RD=2&ST=1&SHA2=2s01OCsteUoWfJbblKZYk913m920p54uzc7PTBRx8Do=

    Read the article

  • How to resolve - dpkg error : old pre-removal script returned error exit status 102

    - by Siva Prasad Varma
    I am unable to install or remove a package on my Ubuntu 10.04 due to the following error:
    $ sudo apt-get autoremove
    Password:
    Reading package lists... Done
    Building dependency tree
    Reading state information... Done
    The following packages will be REMOVED: busybox
    0 upgraded, 0 newly installed, 1 to remove and 9 not upgraded.
    1 not fully installed or removed.
    Need to get 0B/212kB of archives.
    After this operation, 627kB disk space will be freed.
    Do you want to continue [Y/n]? y
    Selecting previously deselected package nscd.
    (Reading database ... 235651 files and directories currently installed.)
    Preparing to replace nscd 2.11.1-0ubuntu7.8 (using .../nscd_2.11.1-0ubuntu7.8_amd64.deb) ...
    invoke-rc.d: not a symlink: /etc/rc2.d/S76nscd
    dpkg: warning: old pre-removal script returned error exit status 102
    dpkg - trying script from the new package instead ...
    invoke-rc.d: not a symlink: /etc/rc2.d/S76nscd
    dpkg: error processing /var/cache/apt/archives/nscd_2.11.1-0ubuntu7.8_amd64.deb (--unpack): subprocess new pre-removal script returned error exit status 102
    update-rc.d: warning: /etc/rc2.d/S76nscd is not a symbolic link
    invoke-rc.d: not a symlink: /etc/rc2.d/S76nscd
    dpkg: error while cleaning up: subprocess installed post-installation script returned error exit status 102
    Errors were encountered while processing:
    /var/cache/apt/archives/nscd_2.11.1-0ubuntu7.8_amd64.deb
    E: Sub-process /usr/bin/dpkg returned an error code (1)
    What should I do to resolve this error? I have tried sudo dpkg --remove --force-remove-reinstreq nscd, but it did not work.
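    The key line in the output is "update-rc.d: warning: /etc/rc2.d/S76nscd is not a symbolic link": the init link for nscd has apparently been replaced by a regular file, and the maintainer scripts abort when they see it. A commonly suggested recovery, sketched here as an assumption rather than a verified fix, is to move the offending file aside, regenerate the rc links, and let apt finish the pending operation:
        ls -l /etc/rc2.d/S76nscd                       # confirm it is a plain file rather than a symlink
        sudo mv /etc/rc2.d/S76nscd /root/S76nscd.bak   # move it out of the way, keeping a copy
        sudo update-rc.d nscd defaults                 # recreate the proper symlinks into /etc/init.d
        sudo apt-get -f install                        # let dpkg/apt complete the interrupted operation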

    Read the article

  • Cannot connect Ubuntu to Windows PC in either direction

    - by Frank A
    I'm new to Ubuntu and really struggling with this. I want to connect to my old Windows XP PC on our home network. Searching for solutions gets complicated: at one level you are told to use Connect to Server, so I set it to Windows Share and type in the server IP address, but I get "Failed to retrieve share list from server", even though a demo on YouTube worked with no problem. Other advice on Ask Ubuntu is that you need to install Samba. I did that, but nothing seems to happen when I try to run it other than it asking for the admin password. (How do you tell what is running on Ubuntu?) So I tried the other direction, Windows XP to Ubuntu. I made the Ubuntu directory within /home/frank shared and tried various combinations such as \\ipaddress\home\frank, but I just get "The folder you entered does not appear to be valid. Please choose another." My entire data-only drive is shared in Windows, and there are no problems accessing it from the other Windows XP boxes on our network. There are no alerts in the Windows firewall; Ubuntu's Firestarter did block, but I changed that to allowed... or so I thought. In Firestarter I had set up an inbound traffic policy for 192.168.1.1/24, and since then it has added the IP address of the Windows PC twice. So I am in a state of confusion, not knowing where to turn next, so I thought I'd ask Ask Ubuntu :)
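    As a rough sketch (not from the original thread), the usual Samba checklist for sharing /home/frank with an XP machine looks something like the following; the share name, path, and user are assumptions to adapt, and on older releases the service is called "samba" rather than "smbd":
        sudo apt-get install samba               # server side of Windows file sharing
        sudo smbpasswd -a frank                  # give the Ubuntu user a Samba password
        # Append a minimal share definition to /etc/samba/smb.conf, for example:
        #   [frank]
        #      path = /home/frank
        #      valid users = frank
        #      read only = no
        sudo service smbd restart                # or: sudo service samba restart on older releases
        smbclient -L localhost -U frank          # confirm the share list is visible locally
    If smbclient can list the share locally but the XP box still cannot, the remaining suspects are the firewall rules on either side and the workgroup name.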

    Read the article

  • Do or can robots cause considerable performance issues?

    - by Anicho
    So the question in the title is exactly what I am trying to find out. My case: at work we are in a discussion with team members who seem to think bots will cause us performance problems on our services website. Our setup: let's say I have the site www.mysite.co.uk, which is a shop window to our online services, which sit on www.mysiteonline.co.uk. When people search Google for mysite they see mysiteonline.co.uk as well as mysite.co.uk. Cases against stopping bots from crawling:
    - We don't store GBs of data publicly available on the web.
    - Most friendly bots, if they were going to cause issues, would have done so already.
    - In our instance the bots can't crawl the site because it requires a username and password.
    - Stopping bots with robots.txt causes an issue with SEO (ref. 1).
    - If it were a malicious bot, it would ignore robots.txt or meta tags anyway.
    Ref. 1: If we were to block robots from crawling mysiteonline.co.uk, this would affect SEO rankings and make it inconvenient for users who actively search for mysite to find mysiteonline, which we can prove is the case for a good portion of our users.
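    For completeness, if the team did decide to keep well-behaved crawlers off the online application, the mechanism is a robots.txt at the web root. This is a minimal sketch with a hypothetical document root; as noted above, malicious bots will simply ignore it:
        # Serve a robots.txt from the web root of the online app (path is an assumption)
        cat > /var/www/mysiteonline/robots.txt <<'EOF'
        User-agent: *
        Disallow: /
        EOF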

    Read the article

  • Manage SQL Server Connectivity through Windows Azure Virtual Machines Remote PowerShell

    - by SQLOS Team
    This blog post comes from Khalid Mouss, Senior Program Manager in Microsoft SQL Server.
    Overview
    The goal of this blog is to demonstrate how we can automate, through PowerShell, connecting multiple SQL Server deployments in Windows Azure Virtual Machines. We will configure the TCP port that we open (and close) through the Windows firewall from a remote PowerShell session to the Virtual Machine (VM). This demonstrates how to take advantage of the remote PowerShell support in Windows Azure Virtual Machines to automate the steps required to connect SQL Server in the same cloud service and in different cloud services.
    Scenario 1: VMs connected through the same cloud service
    Two virtual machines are configured in the same cloud service, each running a different SQL Server instance. Both VMs are configured with remote PowerShell turned on so that PowerShell and other commands can be run in them remotely in order to re-configure them to allow incoming SQL connections from a remote VM or on-premises machine(s). Note: RDP (Remote Desktop Protocol) is kept configured in both VMs by default to be able to remote-connect to them and check the connections to the SQL instances for demo purposes only; it is not actually required.
    Step 1 – Provision VMs and configure ports
    Provision VM1, named DemoVM1, and then provision VM2 (DemoVM2) with PowerShell remoting enabled and connected to DemoVM1 (see the example screenshots in the original post if using the portal). After provisioning the two VMs, note the default port configuration for each.
    Step 2 – Verify/confirm the TCP port used by the Database Engine
    By default, the port will be configured as 1433 – this can be changed to a different port number if desired.
    1. RDP to each of the VMs created above – this also ensures the VMs complete sysprep and finish configuration.
    2. Go to SQL Server Configuration Manager -> SQL Server Network Configuration -> Protocols for <SQL instance> -> TCP/IP -> IP Addresses.
    3. Confirm the port number used by the SQL Server engine; in this case 1433.
    4. Update from Windows Authentication to Mixed Mode.
    5. Restart the SQL Server service for the change to take effect.
    6. Repeat steps 3, 4, and 5 for the second VM: DemoVM2.
    Step 3 – Remote PowerShell to DemoVM1
    Enter-PSSession -ComputerName condemo.cloudapp.net -Port 61503 -Credential <username> -UseSSL -SessionOption (New-PSSessionOption -SkipCACheck -SkipCNCheck)
    You will then be prompted to enter the password.
    Step 4 – Open port 1433 in the Windows firewall
    netsh advfirewall firewall add rule name="DemoVM1Port" dir=in localport=1433 protocol=TCP action=allow
    Output:
    netsh advfirewall firewall show rule name=DemoVM1Port
    Rule Name: DemoVM1Port
    ----------------------------------------------------------------------
    Enabled: Yes
    Direction: In
    Profiles: Domain,Private,Public
    Grouping:
    LocalIP: Any
    RemoteIP: Any
    Protocol: TCP
    LocalPort: 1433
    RemotePort: Any
    Edge traversal: No
    Action: Allow
    Ok.
    Step 5 – Now connect from DemoVM2 to the DB instance in DemoVM1.
    Step 6 – Close port 1433 in the Windows firewall
    netsh advfirewall firewall delete rule name=DemoVM1Port
    Output:
    Deleted 1 rule(s).
    Ok.
    netsh advfirewall firewall show rule name=DemoVM1Port
    No rules match the specified criteria.
    Step 7 – Try to connect from DemoVM2 to the DB instance in DemoVM1
    Because port 1433 has been closed (in step 6) in the Windows firewall on the VM1 machine, we can no longer connect from VM2 remotely to VM1.
    Scenario 2: VMs provisioned in different cloud services
    Two virtual machines are configured in different cloud services, each running a different SQL Server instance. Both VMs are configured with remote PowerShell turned on so that PowerShell and other commands can be run in them remotely in order to re-configure them to allow incoming SQL connections from a remote VM or on-premises machine(s). Note: RDP (Remote Desktop Protocol) is kept configured in both VMs by default to be able to remote-connect to them and check the connections to the SQL instances for demo purposes only; it is not actually needed.
    Step 1 – Provision new VM3
    Provision VM3 (DemoVM3) and note the default port configuration after provisioning completes (see the example screenshots in the original post if using the portal).
    Step 2 – Add a public port to VM1 to connect to from VM3's DB instance
    Since VM3 and VM1 are not connected in the same cloud service, we will need to specify the full DNS address, including the public port, when connecting between the machines. We shall add a public port, 57000 in this case, that is linked to private port 1433 and will be used later to connect to the DB instance.
    Step 3 – Remote PowerShell to DemoVM1
    Enter-PSSession -ComputerName condemo.cloudapp.net -Port 61503 -Credential <UserName> -UseSSL -SessionOption (New-PSSessionOption -SkipCACheck -SkipCNCheck)
    You will then be prompted to enter the password.
    Step 4 – Open port 1433 in the Windows firewall
    netsh advfirewall firewall add rule name="DemoVM1Port" dir=in localport=1433 protocol=TCP action=allow
    Output:
    Ok.
    netsh advfirewall firewall show rule name=DemoVM1Port
    Rule Name: DemoVM1Port
    ----------------------------------------------------------------------
    Enabled: Yes
    Direction: In
    Profiles: Domain,Private,Public
    Grouping:
    LocalIP: Any
    RemoteIP: Any
    Protocol: TCP
    LocalPort: 1433
    RemotePort: Any
    Edge traversal: No
    Action: Allow
    Ok.
    Step 5 – Now connect from DemoVM3 to the DB instance in DemoVM1
    RDP into VM3, launch SSMS, and connect to VM1's DB instance. You must specify the full server name using the DNS address and the public port number configured above.
    Step 6 – Close port 1433 in the Windows firewall
    netsh advfirewall firewall delete rule name=DemoVM1Port
    Output:
    Deleted 1 rule(s).
    Ok.
    netsh advfirewall firewall show rule name=DemoVM1Port
    No rules match the specified criteria.
    Step 7 – Try to connect from DemoVM3 to the DB instance in DemoVM1
    Because port 1433 has been closed (in step 6) in the Windows firewall on the VM1 machine, we can no longer connect from VM3 remotely to VM1.
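    As a quick command-line check after each open/close step, a connectivity test along the following lines can be used. This is a sketch, not part of the original walkthrough: the DNS name and public port are the ones used in scenario 2 above, and the login credentials are placeholders.
        # Hypothetical connectivity check run from DemoVM3 against DemoVM1's public endpoint
        sqlcmd -S condemo.cloudapp.net,57000 -U <sql_login> -P <password> -Q "SELECT @@SERVERNAME"
    When the firewall rule is present the query returns the server name; once the rule is deleted, the same command should fail with a network-level error.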
    Conclusion
    Through the new support for remote PowerShell in Windows Azure Virtual Machines, one can script and automate many Virtual Machine and SQL management tasks. In this blog, we have demonstrated how to start a remote PowerShell session and re-configure the Virtual Machine firewall to allow (or disallow) SQL Server connections.
    References
    SQL Server in Windows Azure Virtual Machines
    Originally posted at http://blogs.msdn.com/b/sqlosteam/

    Read the article

< Previous Page | 376 377 378 379 380 381 382 383 384 385 386 387  | Next Page >