Search Results

Search found 9156 results on 367 pages for 'cloud storage'.


  • CodePlex Daily Summary for Friday, September 28, 2012

    CodePlex Daily Summary for Friday, September 28, 2012Popular ReleasesWPUtils: WPUtils 1.2: Just fixed an issue related to isolated storage path for ChoosePhotoBehavior. Specifically CreateDirectory method only accepts relative path, but was given a "/photos/" path which would result in exception. Please make sure you have this fix if you are using ChoosePhotoBehavior! NOTE: Windows Phone SDK 7.1 or higher is required.TFS Timesheets: TFS Timesheets 2.0: New features: Visual Studio 2012 support Bug fixes: Scaling mode inherited rather than font scalingCRM 2011 Visual Ribbon Editor: Visual Ribbon Editor 1.1 Beta: Visual Ribbon Editor 1.1 Beta What's New: Fixed scrolling issue in UnHide dialog Added support for connecting via ADFS / IFD Added support for more than one action for a button Added support for empty StringParameter for Javascript functions Fixed bug in rule CrmClientTypeRule when selecting Outlook option Extended Prefix field in New Button dialogFree Aspx Image Gallery: Free Aspx Image Gallery Release V1: This is first basic release of my free aspx image gallery project. It is free to use and modify by the user without any need of providing any credit to me.Simple Microsoft Excel Document Converter (Convert To XLS, XLSX, PDF, XPS): ExcelDocConverter 0.1 Beta: Initial Release Microsoft Excel Documents Converter. Microsoft Excel 1997-2003 (XLS) Microsoft Excel 2007/2010 (XLSX) Portable Document Format (PDF) Microsoft XPS Document (XPS) Difference between NET2.0 and NET3.5 This program uses .NET Framework runtime library to run. Basically, they are no differences. Only the runtime library version is different. For older computers, i.e. Windows XP, might not have .NET Framework 3.5 installed, then use NET2.0 in stead. But, some Windows XP SP2 mig...Office File Properties: Office File Properties 3.3.1: Bug fix. Convert file extension to lowercase before checking.LoBDb.NET: LoBDb.NET 1.0.9: Centido.Core library: 1) SQL Server script bug fix: an error when changing the MaxLength property of an indexed string column or when changing the Precision-Scale properties of a decimal column. LobDb.NET Manager: 1) Changing the Precision, Scale, Default Value, Minimum and Maximum properties of a decimal column now enables the Save button. 2) The MaxLength property of a string column and the Precision+Scale values of a decimal column are now displayed in the column list. 3) Changing the Min...Chaos games: Chaos games: Small app for generating fractals using chaos gamesVisual Studio Icon Patcher: Version 1.5.2: This version contains no new images from v1.5.1 Contains the following improvements: Better support for detecting the installed languages The extract & inject commands won’t run if Visual Studio is running You may now run in extract or inject mode The p/invoke code was cleaned up based on Code Analysis recommendations When a p/invoke method fails the Win32 error message is now displayed Error messages use red text Status messages use green textMCEBuddy 2.x: MCEBuddy 2.2.16: Changelog for 2.2.16 (32bit and 64bit) Now a standalone remote client also available to control the Engine remotely. 1. Added support for remote connections for status and configuration. MCEBuddy now uses port 23332. The remote server name, remote server port and local server port can be updated from the MCEBuddy.conf file BUT the Service or GUI needs to be restarted (i.e. reboot or restart service or restart program) for it to take effect. 
Refer to documentation for more details http://mce...ZXing.Net: ZXing.Net 0.9.0.0: On the way to a release 1.0 the API should be stable now with this version. sync with rev. 2393 of the java version improved api better Unity support Windows RT binaries Windows CE binaries new Windows Service demo new WPF demo WindowsCE Hotfix: Fixes an error with ISO8859-1 encoding and scannning of QR-Codes. The hotfix is only needed for the WindowsCE platform.SSIS GoogleAnalyticsSource: Version 1.1 Alpha 2: The component uses now the Google API V2.4 including the management API.MVC Bootstrap: MVC Boostrap 0.5.1: A small demo site, based on the default ASP.NET MVC 3 project template, showing off some of the features of MVC Bootstrap. This release uses Entity Framework 5 for data access and Ninject 3 for dependency injection. If you download and use this project, please give some feedback, good or bad!menu4web: menu4web 1.0 - free javascript menu for web sites: menu4web 1.0 has been tested with all major browsers: Firefox, Chrome, IE, Opera and Safari. Minified m4w.js library is less than 9K. Includes 21 menu examples of different styles. Can be freely distributed under The MIT License (MIT).Rawr: Rawr 5.0.0: This is the Downloadable WPF version of Rawr!For web-based version see http://elitistjerks.com/rawr.php You can find the version notes at: http://rawr.codeplex.com/wikipage?title=VersionNotes Rawr Addon (NOT UPDATED YET FOR MOP)We now have a Rawr Official Addon for in-game exporting and importing of character data hosted on Curse. The Addon does not perform calculations like Rawr, it simply shows your exported Rawr data in wow tooltips and lets you export your character to Rawr (including ba...Coevery - Free CRM: Coevery 1.0.0.26: The zh-CN issue has been solved. We also add a project management module.VidCoder: 1.4.1 Beta: Updated to HandBrake 4971. This should fix some issues with stuck PGS subtitles. Fixed build break which prevented pre-compiled XML serializers from showing up. Fixed problem where a preset would get errantly marked as modified when re-opening the encode settings window or importing a new preset.JSLint for Visual Studio 2010: 1.4.0: VS2012 support is alphaBlackJumboDog: Ver5.7.2: 2012.09.23 Ver5.7.2 (1)InetTest?? (2)HTTP?????????????????100???????????Player Framework by Microsoft: Player Framework for Windows 8 (Preview 6): IMPORTANT: List of breaking changes from preview 5 Added separate samples download with .vsix dependencies instead of source dependencies Support for FreeWheel SmartXML ad responses Support for Smooth Streaming SDK DownloaderPlugins Support for VMAP and TTML polling for live scenarios Support for custom smooth streaming byte stream and scheme handlers Support for new play time and position tracking plugin Added IsLiveChanged event Added AdaptivePlugin.MaxBitrate property Add...New ProjectsChaos games: Small app to generating fractals using chaos gamesDocument Digitalization System: This system will allow the users with on one or more PCs to digitalize pdf files and store it or export it to other file formats.ExternalTokenAnalysisOffline: SPUser?UserToken????????????????。FinalProjectSeniorProject: ***Unfinished*** Senior project build GL Ponpes Selamat Kendal: Aplikasi Akuntansi Sekolah Pondok Pesantren Modern Selamat KendalHealth Care Manager: One of keynote planned for the Brazzaville Microsoft event coming soon.Orchard Commerce History with PayPal: Project expands on Nwazet.Commerce module (and is required for this module to work). 
Adds a purchase history, product role associations, and PayPal.PDF.NET: PWMIS ?????? Ver 4.5 ???? SMS Egypt: This project is intended to make it easy for people to send SMS to their customers using SMS gateways inside and outside Egypt. Strong Caml: Use the familiar CAML syntax, but now do it in strongly-typed, dynamic code. Just follow Visual Studio's IntelliSense, and your CAML query can't go wrong!TrainingFrameWork: TrainingFrameWork

    Read the article

  • Notifications for Expiring DBSNMP Passwords

    - by Courtney Llamas
    Most user accounts these days have a password profile on them that automatically expires the password after a set number of days. Depending on your company's security requirements, this may be as little as 30 days or as long as 365 days, although typically it falls between 60-90 days. For a normal user, this can cause a small interruption in your day as you have to go get your password reset by an admin. When this happens to privileged accounts, such as the DBSNMP account that is responsible for monitoring database availability, it can cause bigger problems. In Oracle Enterprise Manager 12c you may notice the error message "ORA-28002: the password will expire within 5 days" when you connect to a target, or worse you may get "ORA-28001: the password has expired". If you wait too long, your monitoring will fail because the password is locked out. Wouldn't it be nice if we could get an alert 10 days before our DBSNMP password expired? Thanks to Oracle Enterprise Manager 12c Metric Extensions (ME), you can! See the Oracle Enterprise Manager Cloud Control Administrator's Guide for more information on Metric Extensions.

    To create a metric extension, select Enterprise / Monitoring / Metric Extensions, and then click on Create. On the General Properties screen select either Cluster Database or Database Instance, depending on which target you need to monitor. If you have both RAC and Single instance you may need to create one for each. In this example we will create a Cluster Database metric. Enter a Name for the ME and a Display Name. Then select SQL for the Adapter. Adjust the Collection Schedule as desired; for this example we will collect this metric every 1 day. Notice for a metric collected every day, we can determine the exact time we want to collect.

    On the Adapter page, enter the query that you wish to execute. In this example we will use the query below that specifically checks for the DBSNMP user that is expiring within 10 days. Of course, you can adjust this query to alert for any user that can cause an outage such as an application account or service account such as RMAN.

    select username, account_status, trunc(expiry_date-sysdate) days_to_expire
    from dba_users
    where username = 'DBSNMP'
    and expiry_date is not null;

    The next step is to create the columns to store the data returned from the query. Click Add and add a column for each of the fields in the same order that data is returned. The table below will help you complete the column additions.

    Name            Display Name            Column Type   Value Type   Metric Category   Unit
    Username        User Name               Key           String       Security
    AccountStatus   Account Status          Data          String       Security
    DaysToExpire    Days Until Expiration   Data          Number       Security          Days

    When creating the DaysToExpire column, you can add a default threshold here for Warning and Critical (say < 10 and 5). When all columns have been added, click Next.

    On the Credentials page, you can choose to use the default monitoring credentials or specify new credentials. We will use the default credentials established for our target (dbsnmp).

    The next step is to test your Metric Extension. Click on Add to select a target for testing, then click Select. Now click the button Run Test to execute the test against the selected target(s). We can see in the example below that the Metric Extension has executed and returned a value of 68 days to expire. Click Next to proceed. Review the metric extension in the final screen and click Finish. The metric will be created in Editable status.
    Select the metric, click Actions and select Deployable Draft. You can do this once more to move to Published. Finally, we want to apply this metric to a target. When managing many targets, it's best to add your metric to a template; for details on adding a Metric Extension to a template see the Administrator's Guide. For this example, we will deploy this to a target directly. Select Actions / Deploy to Targets. Click Add, select the target you wish to deploy to and click Submit. Once deployment is complete, we can go to the target and view the Metric & Collection Settings to see the new metric and its thresholds. After some time, you will find the metric has collected and the days to expiration for the DBSNMP user can be seen in the All Metrics view. For metrics collected once per day, you may have to wait up to 24 hours to see the metric and current severity. In the example below, the current severity is Clear (green check) as it is not scheduled to expire within 10 days.

    To test the notification, we can edit the thresholds for the new metric so they trigger an alert. Our password expires in 139 days, so we'll change our Warning to 140 and leave Critical at 5; in our example we also changed the collection time to every 5 minutes. At the next collection, you'll find that the current severity changes to a Warning and any related Incident Rules would be triggered to create an Incident or Notification as desired.

    Now that you get a notification that your DBSNMP password is about to expire, you can use the OEM Command Line Interface (EM CLI) verb update_db_password to change it at both the database target and the OEM target in one step. The caveat is you must know the existing password to use the update_db_password command. To learn more about EM CLI, see the Oracle Enterprise Manager Command Line Interface Guide. Below is an example of changing the password with the update_db_password verb.

    $ ./emcli update_db_password -target_name=emrep -target_type=oracle_database -user_name=dbsnmp -change_at_target=yes -change_all_references=yes
    Enter value for old_password :
    Enter value for new_password :
    Enter value for retype_new_password :
    Successfully submitted a job to change the password in Enterprise Manager and on the target database: "emrep"
    Execute "emcli get_jobs -job_id=FA66C1C4D663297FE0437656F20ACC84" to check the status of the job.
    Search for job name "CHANGE_PWD_JOB_FA66C1C4D662297FE0437656F20ACC84" on the Jobs home page to check job execution details.

    The subsequent job created will typically run quickly enough that a blackout is not needed; however, if you submit a script with many targets to change, your job may run slower, so adding a blackout to the script is recommended.

    $ ./emcli get_jobs -job_id=FA66C1C4D663297FE0437656F20ACC84
    Name                                             Type            Job ID                            Execution ID                      Scheduled            Completed            TZ Offset  Status     Status ID  Owner   Target Type      Target Name
    CHANGE_PWD_JOB_FA66C1C4D662297FE0437656F20ACC84  ChangePassword  FA66C1C4D663297FE0437656F20ACC84  FA66C1C4D665297FE0437656F20ACC84  2014-05-28 09:39:12  2014-05-28 09:39:18  GMT-07:00  Succeeded  5          SYSMAN  oracle_database  emrep

    After implementing the above Metric Extension and using the EM CLI update_db_password verb, you will be able to stay on top of your DBSNMP password changes without experiencing an unplanned monitoring outage.
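    For bulk changes across many databases, one option is to generate the emcli calls from a small program. The C# sketch below is not from the original article: it simply writes one update_db_password command per target to a shell script, reusing the verb and flags shown above. The target names in the list are hypothetical placeholders, and emcli will still prompt for the old and new passwords when the script is run.

        // Minimal sketch (assumption: emcli is installed and on the PATH of the host
        // where the generated script will run; the target names below are placeholders).
        using System;
        using System.IO;

        class EmcliPasswordScriptGenerator
        {
            static void Main()
            {
                var targets = new[] { "emrep", "proddb1", "proddb2" };

                using (var script = new StreamWriter("change_dbsnmp_passwords.sh"))
                {
                    script.WriteLine("#!/bin/sh");
                    // The article recommends considering a blackout when many targets are
                    // changed in one run; add your blackout commands here if you need one.
                    foreach (var target in targets)
                    {
                        script.WriteLine(
                            "./emcli update_db_password -target_name={0} -target_type=oracle_database " +
                            "-user_name=dbsnmp -change_at_target=yes -change_all_references=yes",
                            target);
                    }
                }

                Console.WriteLine("Wrote change_dbsnmp_passwords.sh");
            }
        }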

    Read the article

  • Oracle Certification and virtualization Solutions.

    - by scoter
    As stated in the official MOS (My Oracle Support) document 249212.1, support for Oracle products on non-Oracle VM platforms follows exactly the same stance as support for VMware and, so, the only x86 virtualization software solution certified for any Oracle product is "Oracle VM". Based on the facts that:
    - Oracle VM is totally free (you have the option to buy Oracle Support)
    - Certified is pretty different from supported (Oracle VM is certified; others could be supported)
    - With Oracle VM you may not be required to reproduce your issue(s) on a physical server
    - Oracle VM is the only x86 software solution that allows hard-partitioning ***
    *** see details at these Oracle public links:
    http://www.oracle.com/technetwork/server-storage/vm/ovm-hardpart-168217.pdf
    http://www.oracle.com/us/corporate/pricing/partitioning-070609.pdf
    people started asking to migrate from third-party virtualization software (e.g. RH KVM, VMware) to Oracle VM.

    Migrating a RH KVM guest to Oracle VM. Oracle VM has a built-in P2V utility (see the official documentation), but in some cases we can't use it, due to:
    - network inaccessibility between hypervisors (KVM and OVM)
    - network slowness between hypervisors (KVM and OVM)
    - the size of the guest virtual-disks
    Here you'll find a step-by-step guide to "manually" migrate a guest machine from KVM to OVM.

    1. Verify the source guest characteristics.
    Using the KVM web console you can verify the characteristics of the guest you need to migrate, such as: CPU core details, defined memory (RAM), the name of your guest, the guest operating system, disk details (number and size), and network details (number of NICs and network configuration).

    2. Export your guest in OVF / OVA format.
    The export from Red Hat KVM (kernel virtual machine) will create a structured export of your guest:

    [root@ovmserver1 mnt]# ll
    total 12
    drwxrwx--- 5 36 36 4096 Oct 19 2012 b8296fca-13c4-4841-a50f-773b5139fcee

    b8296fca-13c4-4841-a50f-773b5139fcee is the ID of the guest exported from RH-KVM.

    [root@ovmserver1 mnt]# cd b8296fca-13c4-4841-a50f-773b5139fcee/
    [root@ovmserver1 b8296fca-13c4-4841-a50f-773b5139fcee]# ls -ltr
    total 12
    drwxr-x--- 4 36 36 4096 Oct 19  2012 master
    drwxrwx--- 2 36 36 4096 Oct 29  2012 dom_md
    drwxrwx--- 4 36 36 4096 Oct 31  2012 images

    images contains your exported virtual-disks.

    [root@ovmserver1 b8296fca-13c4-4841-a50f-773b5139fcee]# cd images/
    [root@ovmserver1 images]# ls -ltra
    total 16
    drwxrwx--- 5 36 36 4096 Oct 19  2012 ..
    drwxrwx--- 2 36 36 4096 Oct 31  2012 d4ef928d-6dc6-4743-b20d-568b424728a5
    drwxrwx--- 2 36 36 4096 Oct 31  2012 4b241ea0-43aa-4f3b-ab7d-2fc633b491a1
    drwxrwx--- 4 36 36 4096 Oct 31  2012 .
    [root@ovmserver1 images]# cd d4ef928d-6dc6-4743-b20d-568b424728a5/
    [root@ovmserver1 d4ef928d-6dc6-4743-b20d-568b424728a5]# ls -l
    total 5169092
    -rwxr----- 1 36 36 187904819200 Oct 31  2012 4c03b1cf-67cc-4af0-ad1e-529fd665dac1
    -rw-rw---- 1 36 36          341 Oct 31  2012 4c03b1cf-67cc-4af0-ad1e-529fd665dac1.meta
    [root@ovmserver1 d4ef928d-6dc6-4743-b20d-568b424728a5]# file 4c03b1cf-67cc-4af0-ad1e-529fd665dac1
    4c03b1cf-67cc-4af0-ad1e-529fd665dac1: LVM2 (Linux Logical Volume Manager), UUID: sZL1Ttpy0vNqykaPahEo3hK3lGhwspv

    4c03b1cf-67cc-4af0-ad1e-529fd665dac1 is the first exported disk (physical volume).

    [root@ovmserver1 d4ef928d-6dc6-4743-b20d-568b424728a5]# cd ../4b241ea0-43aa-4f3b-ab7d-2fc633b491a1/
    [root@ovmserver1 4b241ea0-43aa-4f3b-ab7d-2fc633b491a1]# ls -l
    total 5568076
    -rwxr----- 1 36 36 107374182400 Oct 31  2012 9020f2e1-7b8a-4641-8f80-749768cc237a
    -rw-rw---- 1 36 36          341 Oct 31  2012 9020f2e1-7b8a-4641-8f80-749768cc237a.meta
    [root@ovmserver1 4b241ea0-43aa-4f3b-ab7d-2fc633b491a1]# file 9020f2e1-7b8a-4641-8f80-749768cc237a
    9020f2e1-7b8a-4641-8f80-749768cc237a: x86 boot sector; partition 1: ID=0x83, active, starthead 1, startsector 63, 401562 sectors; partition 2: ID=0x82, starthead 0, startsector 401625, 65529135 sectors; partition 3: ID=0x83, starthead 254, startsector 65930760, 8385930 sectors; partition 4: ID=0x5, starthead 254, startsector 74316690, 135395820 sectors, code offset 0x48

    9020f2e1-7b8a-4641-8f80-749768cc237a is the second exported disk, with partition 1 bootable.

    3. Prepare the new guest on Oracle VM.
    With OVM Manager we can prepare the guest where we will move the exported virtual-disks; under the "Servers and VMs" tab, create your guest with the parameters collected before (point 1):
    - add NICs on the different networks
    - add virtual-disks; in this case we add two disks of 1.0 GB each; we will extend each virtual disk by copying the source KVM virtual-disk over it (see the next steps)
    - verify the virtual-disks created (under the Repositories tab)

    4. Verify the OVM virtual-disk names.

    [root@ovmserver1 VirtualMachines]# grep -r hyptest_rdbms *
    0004fb0000060000a906b423f44da98e/vm.cfg:OVM_simple_name = 'hyptest_rdbms'
    [root@ovmserver1 VirtualMachines]# cd 0004fb0000060000a906b423f44da98e
    [root@ovmserver1 0004fb0000060000a906b423f44da98e]# more vm.cfg
    vif = ['mac=00:21:f6:0f:3f:85,bridge=0004fb001089128', 'mac=00:21:f6:0f:3f:8e,bridge=0004fb00101971d']
    OVM_simple_name = 'hyptest_rdbms'
    vnclisten = '127.0.0.1'
    disk = ['file:/OVS/Repositories/0004fb00000300004f17b7368139eb41/VirtualDisks/0004fb000012000097c1bfea9834b17d.img,xvda,w', 'file:/OVS/Repositories/0004fb00000300004f17b7368139eb41/VirtualDisks/0004fb0000120000cde6a11c3cb1d0be.img,xvdb,w']
    vncunused = '1'
    uuid = '0004fb00-0006-0000-a906-b423f44da98e'
    on_reboot = 'restart'
    cpu_weight = 27500
    memory = 32768
    cpu_cap = 0
    maxvcpus = 8
    OVM_high_availability = True
    maxmem = 32768
    vnc = '1'
    OVM_description = ''
    on_poweroff = 'destroy'
    on_crash = 'restart'
    name = '0004fb0000060000a906b423f44da98e'
    guest_os_type = 'linux'
    builder = 'hvm'
    vcpus = 8
    keymap = 'en-us'
    OVM_os_type = 'Oracle Linux 5'
    OVM_cpu_compat_group = ''
    OVM_domain_type = 'xen_hvm'

    disk2 ovm ==> /OVS/Repositories/0004fb00000300004f17b7368139eb41/VirtualDisks/0004fb0000120000cde6a11c3cb1d0be.img
    disk1 ovm ==> /OVS/Repositories/0004fb00000300004f17b7368139eb41/VirtualDisks/0004fb000012000097c1bfea9834b17d.img

    Summarizing:
    disk1 --source ==> /mnt/b8296fca-13c4-4841-a50f-773b5139fcee/images/4b241ea0-43aa-4f3b-ab7d-2fc633b491a1/9020f2e1-7b8a-4641-8f80-749768cc237a
    disk1 --dest   ==> /OVS/Repositories/0004fb00000300004f17b7368139eb41/VirtualDisks/0004fb000012000097c1bfea9834b17d.img
    disk2 --source ==> /mnt/b8296fca-13c4-4841-a50f-773b5139fcee/images/d4ef928d-6dc6-4743-b20d-568b424728a5/4c03b1cf-67cc-4af0-ad1e-529fd665dac1
    disk2 --dest   ==> /OVS/Repositories/0004fb00000300004f17b7368139eb41/VirtualDisks/0004fb0000120000cde6a11c3cb1d0be.img

    5. Copy the exported KVM virtual-disks to the OVM virtual-disks.
    Keeping your Oracle VM guest stopped, you can copy the exported KVM virtual-disks over the OVM virtual-disks; in my case I simply mounted the filesystem containing the exported virtual-disks locally (via a USB device) on my OVS. The copy automatically resizes the OVM virtual-disks (previously created with a size of 1 GB).

    nohup cp /mnt/b8296fca-13c4-4841-a50f-773b5139fcee/images/4b241ea0-43aa-4f3b-ab7d-2fc633b491a1/9020f2e1-7b8a-4641-8f80-749768cc237a /OVS/Repositories/0004fb00000300004f17b7368139eb41/VirtualDisks/0004fb000012000097c1bfea9834b17d.img &
    nohup cp /mnt/b8296fca-13c4-4841-a50f-773b5139fcee/images/d4ef928d-6dc6-4743-b20d-568b424728a5/4c03b1cf-67cc-4af0-ad1e-529fd665dac1 /OVS/Repositories/0004fb00000300004f17b7368139eb41/VirtualDisks/0004fb0000120000cde6a11c3cb1d0be.img &

    6. When the copy is completed, refresh the repository to acknowledge the new disk sizes.

    7. After the repository refresh is completed, start the guest machine from Oracle VM Manager.
    After the first start of your guest:
    - verify that you can see all disks and partitions
    - verify that your guest is reachable over the network (the MAC addresses of your NICs changed)
    Eventually you can also evaluate converting your guest to a PVM (paravirtualized virtual machine) following the official Oracle documentation. Ciao, Simon COTER
    PS: next time I'd like to post an article on how to manually migrate Virtual-Iron guests to Oracle VM. Comments and corrections are welcome.

    Read the article

  • PASS Summit 2010 BI Workshop Feedbacks

    - by Davide Mauri
    As many other speakers have already done, I'd like to share with the SQL community the feedback from my PASS Summit 2010 workshop. For those who were not there, my workshop was "BI From A-Z", and its main objective was to introduce people to the BI world not only from a technical point of view, but also to insist a lot on a methodological and "engineered" approach. The will to put more engineering into IT (and especially into the BI field) is something that has been growing stronger and stronger in me every day over the last 5 years, since I simply envy the fact that Airbus, Fincantieri, BMW (just to name a few) can create very complex machines "just" by putting people together and giving them some rules to follow (of course this is an oversimplification, but I think you get what I mean). The key point of engineering is that, after having defined the project blueprint, you have the possibility to give a huge number of people the rules to follow, the correct tools to implement those rules easily and semi-automatically, and a way to measure the quality of the results. Could this be done in IT? Very big question, so my scope is for now limited to BI.

    So that's the main point of my workshop: an entry-level approach to BI (the level was 200) to allow attendees to learn the basics, to understand which tools they should use for which purpose and, above all, a set of rules and tools to make a BI solution scalable in terms of the people working on it, while still maintaining very good quality. All of this done not by focusing only on practice, but by explaining the theory behind it, to show how it can help *a lot* to build a correct solution regardless of the technology used to implement it. The idea is to reach a point where more than 70% of the work done to create a BI solution can be reused even if the technology changes. This is a very demanding challenge nowadays with the coming of Denali and its column-aligned storage and the shiny new DAX language.

    As you may understand, I was looking forward to getting the feedback, since you may have noticed that there's a lot of "architectural" stuff in IT but really nothing on "engineering". So how the session would be perceived by the attendees was really unknown to me. The feedback could also give a good indication of whether the need for more "engineering" is something only I feel, or something more broadly shared. I'm very happy to be able to say that the overall score of 4.75 put my workshop in the TOP 20 sessions (out of nearly 200 sessions)! Here are the detailed evaluations:

    How would you rate the usefulness of the information presented in your day-to-day environment? 4.75
    Answer / # of responses: 3 : 1, 4 : 12, 5 : 42

    How would you rate the Speaker's presentation skills? 4.80
    Answer / # of responses: 3 : 1, 4 : 9, 5 : 45

    How would you rate the Speaker's knowledge of the subject? 4.95
    Answer / # of responses: 4 : 3, 5 : 52

    How would you rate the accuracy of the session title, description and experience level to the actual session? 4.75
    Answer / # of responses: 3 : 2, 4 : 10, 5 : 43

    How would you rate the amount of time allocated to cover the topic/session? 4.44
    Answer / # of responses: 3 : 7, 4 : 17, 5 : 31

    How would you rate the quality of the presentation materials? 4.62
    Answer / # of responses: 4 : 21, 5 : 34

    The comments were all very positive.
    Many of them asked for more time on the subject (or to shorten the very last topics). I'll treasure these comments and will review the content accordingly. We'll organize a two-day class on this topic, where more examples will be shown and some arguments will be explained more deeply. I'd just like to answer a comment that asks how much of what I showed is "universally applicable". I can tell you that all of our BI projects follow these rules, and they've been applied to different markets (Insurance, Fashion, GDO) with different people and different teams, and they allowed us to be "adaptive" toward the customer. The better the rules are defined, and the more tools there are that support their implementation, the easier it is to add new people to the project and to add or change solution features. Think of a car. How come almost any mechanic can help you fix a problem? Because they know what to expect. Because there are rules that allow them to identify the problem without having to discover each time how the car has been built. And this is of course also true for car upgrades/improvements. Last but not least: thanks a lot to everyone for coming!

    Read the article

  • Azure Task Scheduling Options

    - by charlie.mott
    Currently, the Azure PaaS does not offer a distributed\resilient task scheduling service. If you do want to host a task scheduling product\solution off-premise (and ideally use Azure), what are your options?

    PaaS

    Option 1: Worker Roles
    Use a worker role to schedule and execute actions at specific time periods. There are a few frameworks available to assist with this:
    http://azuretoolkit.codeplex.com
    https://github.com/Lokad/lokad-cloud/wiki/TaskScheduler
    http://blog.smarx.com/posts/building-a-task-scheduler-in-windows-azure - This addresses a slightly different set of requirements. It's a more dynamic approach for queuing up tasks, but not repeatable tasks (e.g. daily).
    I found the Azure Toolkit option the simplest to implement.
    Step 1: Create a domain entity implementing IJob for each job to schedule. In this sample, I asynchronously call a WCF service method.

    namespace Acme.WorkerRole.Jobs
    {
        using AzureToolkit;
        using ScheduledTasksService;

        public class UploadEmployeesJob : IJob
        {
            public void Run()
            {
                // Call Tasks Service
                var client = new ScheduledTasksServiceClient("BasicHttpBinding_IScheduledTasksService");
                client.UploadEmployees();
                client.Close();
            }
        }
    }

    Step 2: In the worker role Run method, add the jobs to the toolkit engine.

    namespace Acme.WorkerRole
    {
        using AzureToolkit.Engine;
        using Jobs;

        public class WorkerRole : WorkerRoleEntryPoint
        {
            public override void Run()
            {
                var engine = new CloudEngine();

                // Add Scheduled Jobs (using CronJob syntax - see http://www.adminschoice.com/crontab-quick-reference).

                // 1. Upload Employee job - 8.00 PM every weekday (Mon-Fri)
                engine.WithJobScheduler().ScheduleJob<UploadEmployeesJob>(c => { c.CronSchedule = "0 20 * * 1-5"; });
                // 2. Purge Data job - 10 AM every Saturday
                engine.WithJobScheduler().ScheduleJob<PurgeDataJob>(c => { c.CronSchedule = "0 10 * * 6"; });
                // 3. Process Exceptions job - Every 5 minutes
                engine.WithJobScheduler().ScheduleJob<ProcessExceptionsJob>(c => { c.CronSchedule = "*/5 * * * *"; });

                engine.Run();
                base.Run();
            }
        }
    }

    Pros: The Azure Toolkit option is simple to implement.
    Cons: For the AzureToolkit option, you are limited to a single worker role; otherwise, the jobs will be executed multiple times, once for each worker role instance. You are also paying for a continuously running worker role, even if it just processes a single job once a week. If you only have a few scheduled tasks to run calling asynchronous services hosted in different web roles, an extra small worker role is likely to be sufficient; however, an extra small worker role still costs $14.40/month (03/09/2012).

    Option 2: Use a Scheduled Task on an Azure Web Role calling a console app
    Set up a Windows Scheduled Task on the Azure Web Role. This calls a console application that calls the WCF service methods that run the task actions. This design is described here:
    http://www.ronaldwidha.net/2011/02/23/cron-job-on-azure-using-scheduled-task-on-a-web-role-to-replace-azure-worker-role-for-background-job/
    http://www.voiceoftech.com/swhitley/index.php/2011/07/windows-azure-task-scheduler/
    http://devlicio.us/blogs/vinull/archive/2011/10/23/moving-to-azure-worker-roles-for-nothing-and-tasks-for-free.aspx
    Pros: Fairly easy to implement.
    Cons: Supportability - I RDC'ed onto the Azure server and stopped the scheduled task. I then rebooted the machine and the task was re-started.
    I also tried deleting the task and rebooting; the same thing occurred. The only way to permanently guarantee that a task is disabled is to do a fresh deployment. I think this is a major supportability concern. Scalability - multiple instances would trigger multiple tasks. You can only have one instance for the scheduled task web role. The guidance implements setup of the scheduled task as part of a web role instance. But if you have more than one instance in a web role, the task will be triggered multiple times for each scheduled action (once per machine). Workaround: If we wanted to use scheduled tasks for another client with a scalable WCF service, then we could include the console app & task scripts in a separate web role (e.g. an empty WCF service with no real purpose to it).

    SaaS

    Option 3: Azure Marketplace
    I thought that someone might be offering this type of service via the Azure Marketplace. At the point of writing this blog post, I did not find anyone doing so. https://datamarket.azure.com/
    Cons: Nobody currently offers this on the Azure Marketplace.

    Option 4: Online Job Scheduling Service Provider
    There are plenty of online providers that offer this type of service on a pay-as-you-go approach. Some of these are free for small usage. Many of these providers are listed here: http://en.wikipedia.org/wiki/Webcron
    Pros: No bespoke development for the scheduler.
    Cons: Reliance on a third party.

    IaaS

    Option 5: Set up Scheduling Software on Azure IaaS VMs
    One of the job scheduling software offerings could be installed and configured on Azure VMs. A list of software options is listed here: http://en.wikipedia.org/wiki/List_of_job_scheduler_software
    Pros: Enterprise distributed\resilient task scheduling service.
    Cons: VM setup and maintenance; software licence costs.

    Option 6: VM Gallery
    At the time of writing this blog post, I did not spot a VM in the gallery that included pre-installation of any of the above software options.
    Cons: No current VM template.

    Summary
    For my current project, which had a small handful of tasks to schedule and a limited budget, I chose option 1 (a worker role using the Azure Toolkit to schedule tasks). If I were building an enterprise-scale solution for the future, options 4 and 5 are currently worthy of consideration. Hopefully, Microsoft will include task scheduling in the future as part of their PaaS offerings.
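    One way to soften the "jobs run once per worker role instance" limitation noted under Option 1 is to let only one instance schedule the jobs. The sketch below is my own illustration, not part of the original post: it elects the instance with the lowest Id using the Azure ServiceRuntime API. It is a naive guard rather than a proper distributed lock (a blob-lease based lock would be more robust), so treat it as a starting point.

        // Minimal sketch: only the role instance with the lowest Id runs the scheduler.
        // Assumes a reference to Microsoft.WindowsAzure.ServiceRuntime.
        using System.Linq;
        using Microsoft.WindowsAzure.ServiceRuntime;

        public static class SchedulerElection
        {
            // Returns true on exactly one instance of the current role.
            public static bool IsSchedulerInstance()
            {
                var current = RoleEnvironment.CurrentRoleInstance;
                var firstId = current.Role.Instances
                    .Select(i => i.Id)
                    .OrderBy(id => id)
                    .First();
                return current.Id == firstId;
            }
        }

    In the worker role Run method from Option 1, the ScheduleJob calls could then be wrapped in an if (SchedulerElection.IsSchedulerInstance()) block, so the remaining instances do not schedule duplicate jobs.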

    Read the article

  • SharePoint logging to a list

    - by Norgean
    I recently worked in an environment with several servers. Locating the correct SharePoint log file for error messages, or development trace calls, is cumbersome. And once the solution hit the cloud, it got even worse, as we had no access to the log files at all. Obviously we are not the only ones with this problem, and the current trend seems to be to log to a list. This had become an off-hour project, so rather than do the sensible thing and find a ready-made solution, I decided to do it the hard way. So! Fire up Visual Studio, create yet another empty SharePoint solution, and start to think of some requirements. Easy on/offI want to be able to turn list-logging on and off.Easy loggingFor me, this means being able to use string.Format.Easy filteringLet's have the possibility to add some filtering columns; category and severity, where severity can be "verbose", "warning" or "error". Easy on/off Well, that's easy. Create a new web feature. Add an event receiver, and create the list on activation of the feature. Tear the list down on de-activation. I chose not to create a new content type; I did not feel that it would give me anything extra. I based the list on the generic list - I think a better choice would have been the announcement type. Approximately: public void CreateLog(SPWeb web)         {             var list = web.Lists.TryGetList(LogListName);             if (list == null)             {                 var listGuid = web.Lists.Add(LogListName, "Logging for the masses", SPListTemplateType.GenericList);                 list = web.Lists[listGuid];                 list.Title = LogListTitle;                 list.Update();                 list.Fields.Add(Category, SPFieldType.Text, false);                 var stringColl = new StringCollection();                 stringColl.AddRange(new[]{Error, Information, Verbose});                 list.Fields.Add(Severity, SPFieldType.Choice, true, false, stringColl);                 ModifyDefaultView(list);             }         }Should be self explanatory, but: only create the list if it does not already exist (d'oh). Best practice: create it with a Url-friendly name, and, if necessary, give it a better title. ...because otherwise you'll have to look for a list with a name like "Simple_x0020_Log". I've added a couple of fields; a field for category, and a 'severity'. Both to make it easier to find relevant log messages. Notice that I don't have to call list.Update() after adding the fields - this would cause a nasty error (something along the lines of "List locked by another user"). The function for deleting the log is exactly as onerous as you'd expect:         public void DeleteLog(SPWeb web)         {             var list = web.Lists.TryGetList(LogListTitle);             if (list != null)             {                 list.Delete();             }         } So! "All" that remains is to log. Also known as adding items to a list. Lots of different methods with different signatures end up calling the same function. For example, LogVerbose(web, message) calls LogVerbose(web, null, message) which again calls another method which calls: private static void Log(SPWeb web, string category, string severity, string textformat, params object[] texts)         {             if (web != null)             {                 var list = web.Lists.TryGetList(LogListTitle);                 if (list != null)                 {                     var item = list.AddItem(); // NOTE! NOT list.Items.Add… just don't, mkay?                     
var text = string.Format(textformat, texts);                     if (text.Length > 255) // because the title field only holds so many chars. Sigh.                         text = text.Substring(0, 254);                     item[SPBuiltInFieldId.Title] = text;                     item[Degree] = severity;                     item[Category] = category;                     item.Update();                 }             } // omitted: Also log to SharePoint log.         } By adding a params parameter I can call it as if I was doing a Console.WriteLine: LogVerbose(web, "demo", "{0} {1}{2}", "hello", "world", '!'); Ok, that was a silly example, a better one might be: LogError(web, LogCategory, "Exception caught when updating {0}. exception: {1}", listItem.Title, ex); For performance reasons I use list.AddItem rather than list.Items.Add. For completeness' sake, let us include the "ModifyDefaultView" function that I deliberately skipped earlier.         private void ModifyDefaultView(SPList list)         {             // Add fields to default view             var defaultView = list.DefaultView;             var exists = defaultView.ViewFields.Cast<string>().Any(field => String.CompareOrdinal(field, Severity) == 0);               if (!exists)             {                 var field = list.Fields.GetFieldByInternalName(Severity);                 if (field != null)                     defaultView.ViewFields.Add(field);                 field = list.Fields.GetFieldByInternalName(Category);                 if (field != null)                     defaultView.ViewFields.Add(field);                 defaultView.Update();                   var sortDoc = new XmlDocument();                 sortDoc.LoadXml(string.Format("<Query>{0}</Query>", defaultView.Query));                 var orderBy = (XmlElement) sortDoc.SelectSingleNode("//OrderBy");                 if (orderBy != null && sortDoc.DocumentElement != null)                     sortDoc.DocumentElement.RemoveChild(orderBy);                 orderBy = sortDoc.CreateElement("OrderBy");                 sortDoc.DocumentElement.AppendChild(orderBy);                 field = list.Fields[SPBuiltInFieldId.Modified];                 var fieldRef = sortDoc.CreateElement("FieldRef");                 fieldRef.SetAttribute("Name", field.InternalName);                 fieldRef.SetAttribute("Ascending", "FALSE");                 orderBy.AppendChild(fieldRef);                   fieldRef = sortDoc.CreateElement("FieldRef");                 field = list.Fields[SPBuiltInFieldId.ID];                 fieldRef.SetAttribute("Name", field.InternalName);                 fieldRef.SetAttribute("Ascending", "FALSE");                 orderBy.AppendChild(fieldRef);                 defaultView.Query = sortDoc.DocumentElement.InnerXml;                 //defaultView.Query = "<OrderBy><FieldRef Name='Modified' Ascending='FALSE' /><FieldRef Name='ID' Ascending='FALSE' /></OrderBy>";                 defaultView.Update();             }         } First two lines are easy - see if the default view includes the "Severity" column. If it does - quit; our job here is done.Adding "severity" and "Category" to the view is not exactly rocket science. But then? Then we build the sort order query. Through XML. The lines are numerous, but boring. All to achieve the CAML query which is commented out. The major benefit of using the dom to build XML, is that you may get compile time errors for spelling mistakes. 
    I say 'may', because although the compiler will not let you forget to close a tag, it will cheerfully let you spell "Name" as "Naem". Whichever you prefer, at the end of the day the view will sort by modified date and ID, both descending. I added the ID as there may be several items with the same time stamp. So! Simple logging to a list, with a sensible view, and with the normal functionality for creating your own filters. I should probably have added some more views in code, pre-filtered for "only errors", "errors and warnings" etc. And it would be nice to block verbose logging completely, but I'm not happy with the alternatives. (yetanotherfeature or an admin page seem like overkill - perhaps just removing it as one of the choices, and not logging if it isn't there?) Before you comment - yes, try-catches have been removed for clarity. There is nothing worse than having a logging function that breaks your site!
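    To show how the logger is meant to be consumed, here is a short usage sketch of my own. It assumes the methods above live in a static class (called Logger here, which is an assumption), that the feature creating the list has been activated on the web, and that the LogVerbose/LogError overloads mirror the ones shown earlier; the TitleUpdater scenario is hypothetical.

        // Usage sketch only - Logger class name and UpdateTitle scenario are hypothetical;
        // the LogVerbose/LogError overloads mirror the wrapper methods described above.
        using System;
        using Microsoft.SharePoint;

        public class TitleUpdater
        {
            private const string LogCategory = "TitleUpdater";

            public void UpdateTitle(SPListItem listItem, string newTitle)
            {
                var web = listItem.Web;
                try
                {
                    Logger.LogVerbose(web, LogCategory, "Updating '{0}' to '{1}'", listItem.Title, newTitle);
                    listItem[SPBuiltInFieldId.Title] = newTitle;
                    listItem.Update();
                }
                catch (Exception ex)
                {
                    // The list-logging wrapper already checks that the log list exists,
                    // so logging here will not itself break the site.
                    Logger.LogError(web, LogCategory, "Exception caught when updating {0}. exception: {1}", listItem.Title, ex);
                    throw;
                }
            }
        }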

    Read the article

  • Entity Framework 6: Alpha2 Now Available

    - by ScottGu
    The Entity Framework team recently announced the 2nd alpha release of EF6.   The alpha 2 package is available for download from NuGet. Since this is a pre-release package make sure to select “Include Prereleases” in the NuGet package manager, or execute the following from the package manager console to install it: PM> Install-Package EntityFramework -Pre This week’s alpha release includes a bunch of great improvements in the following areas: Async language support is now available for queries and updates when running on .NET 4.5. Custom conventions now provide the ability to override the default conventions that Code First uses for mapping types, properties, etc. to your database. Multi-tenant migrations allow the same database to be used by multiple contexts with full Code First Migrations support for independently evolving the model backing each context. Using Enumerable.Contains in a LINQ query is now handled much more efficiently by EF and the SQL Server provider resulting greatly improved performance. All features of EF6 (except async) are available on both .NET 4 and .NET 4.5. This includes support for enums and spatial types and the performance improvements that were previously only available when using .NET 4.5. Start-up time for many large models has been dramatically improved thanks to improved view generation performance. Below are some additional details about a few of the improvements above: Async Support .NET 4.5 introduced the Task-Based Asynchronous Pattern that uses the async and await keywords to help make writing asynchronous code easier. EF 6 now supports this pattern. This is great for ASP.NET applications as database calls made through EF can now be processed asynchronously – avoiding any blocking of worker threads. This can increase scalability on the server by allowing more requests to be processed while waiting for the database to respond. The following code shows an MVC controller that is querying a database for a list of location entities:     public class HomeController : Controller     {         LocationContext db = new LocationContext();           public async Task<ActionResult> Index()         {             var locations = await db.Locations.ToListAsync();               return View(locations);         }     } Notice above the call to the new ToListAsync method with the await keyword. When the web server reaches this code it initiates the database request, but rather than blocking while waiting for the results to come back, the thread that is processing the request returns to the thread pool, allowing ASP.NET to process another incoming request with the same thread. In other words, a thread is only consumed when there is actual processing work to do, allowing the web server to handle more concurrent requests with the same resources. A more detailed walkthrough covering async in EF is available with additional information and examples. Also a walkthrough is available showing how to use async in an ASP.NET MVC application. Custom Conventions When working with EF Code First, the default behavior is to map .NET classes to tables using a set of conventions baked into EF. For example, Code First will detect properties that end with “ID” and configure them automatically as primary keys. However, sometimes you cannot or do not want to follow those conventions and would rather provide your own. For example, maybe your primary key properties all end in “Key” instead of “Id”. 
    Custom conventions allow the default conventions to be overridden or new conventions to be added so that Code First can map by convention using whatever rules make sense for your project. The following code demonstrates using custom conventions to set the precision of all decimals to 5. As with other Code First configuration, this code is placed in the OnModelCreating method which is overridden on your derived DbContext class:

        protected override void OnModelCreating(DbModelBuilder modelBuilder)
        {
            modelBuilder.Properties<decimal>()
                .Configure(x => x.HasPrecision(5));
        }

    But what if there are a couple of places where a decimal property should have a different precision? Just as with all the existing Code First conventions, this new convention can be overridden for a particular property simply by explicitly configuring that property using either the fluent API or a data annotation. A more detailed description of custom Code First conventions is available here.

    Community Involvement

    I blogged a while ago about EF being released under an open source license. Since then a number of community members have made contributions and these are included in EF6 alpha 2. Two examples of community contributions are:

    AlirezaHaghshenas contributed a change that increases the startup performance of EF for larger models by improving the performance of view generation. The change means that it is less often necessary to use pre-generated views.

    UnaiZorrilla contributed the first community feature to EF: the ability to load all Code First configuration classes in an assembly with a single method call like the following:

        protected override void OnModelCreating(DbModelBuilder modelBuilder)
        {
            modelBuilder.Configurations
                .AddFromAssembly(typeof(LocationContext).Assembly);
        }

    This code will find and load all the classes that inherit from EntityTypeConfiguration<T> or ComplexTypeConfiguration<T> in the assembly where LocationContext is defined. This reduces the amount of coupling between the context and Code First configuration classes, and is also a very convenient shortcut for large models.

    Other upcoming features coming in EF 6

    Lots of information about the development of EF6 can be found on the EF CodePlex site, including a roadmap showing the other features that are planned for EF6. One of the nice upcoming features is connection resiliency, which will automate the process of retrying database operations on transient failures common in cloud environments and with databases such as the Windows Azure SQL Database. Another often requested feature that will be included in EF6 is the ability to map stored procedures to query and update operations on entities when using Code First.

    Summary

    EF6 is the first open source release of Entity Framework being developed in CodePlex. The alpha 2 preview release of EF6 is now available on NuGet, and contains some really great features for you to try. The EF team are always looking for feedback from developers - especially on the new features such as custom Code First conventions and async support. To provide feedback you can post a comment on the EF6 alpha 2 announcement post, start a discussion or file a bug on the CodePlex site. Hope this helps, Scott

    P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu
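    To round out the async example from the post, here is a short sketch of an async update using the same LocationContext and Locations set from the controller shown earlier. The Rename action and the Name property are my own placeholders; FindAsync and SaveChangesAsync are part of the EF6 async surface described above.

        // Sketch only: the Rename action and the Name property are hypothetical.
        using System.Threading.Tasks;
        using System.Web.Mvc;

        public class LocationsController : Controller
        {
            private readonly LocationContext db = new LocationContext();

            [HttpPost]
            public async Task<ActionResult> Rename(int id, string newName)
            {
                var location = await db.Locations.FindAsync(id);   // async query by primary key
                if (location == null)
                {
                    return HttpNotFound();
                }

                location.Name = newName;
                await db.SaveChangesAsync();                        // async save, frees the request thread while waiting

                return RedirectToAction("Index");
            }
        }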

    Read the article

  • Adventures in Windows 8: Working around the navigation animation issues in LayoutAwarePage

    - by Laurent Bugnion
    LayoutAwarePage is a pretty cool add-on to Windows 8 apps, which facilitates greatly the implementation of orientation-aware (portrait, landscape) as well as state-aware (snapped, filled, fullscreen) apps. It has however a few issues that are obvious when you use transformed elements on your page. Adding a LayoutAwarePage to your application If you start with a blank app, the MainPage is a vanilla Page, with no such feature. In order to have a LayoutAwarePage into your app, you need to add this class (and a few helpers) with the following operation: Right click on the Solution and select Add, New Item from the context menu. From the dialog, select a Basic Page (not a Blank Page, which is another vanilla page). If you prefer, you can also use Split Page, Items Page, Item Detail Page, Grouped Items Page or Group Detail Page which are all LayoutAwarePages. Personally I like to start with a Basic Page, which gives me more creative freedom. Adding this new page will cause Visual Studio to show a prompt asking you for permission to add additional helper files to the Common folder. One of these helpers in the LayoutAwarePage class, which is where the magic happens. LayoutAwarePage offers some help for the detection of orientation and state (which makes it a pleasure to design for all these scenarios in Blend, by the way) as well as storage for the navigation state (more about that in a future article). Issue with LayoutAwarePage When you use UI elements such as a background picture, a watermark label, logos, etc, it is quite common to do a few things with those: Making them partially transparent (this is especially true for background pictures; for instance I really like a black Page background with a half transparent picture placed on top of it). Transforming them, for instance rotating them a bit, scaling them, etc. Here is an example with a picture of my two beautiful daughters in the Bird Park in Kuala Lumpur, as well as a transformed TextBlock. The image has an opacity of 40% and the TextBlock a simple RotateTransform. If I create an application with a MainPage that navigates to this LayoutAwarePage, however, I will have a very annoying effect: The background picture appears with an Opacity of 100%. The TextBlock is not rotated. This lasts only for less than a second (during the navigation animation) before the elements “snap into place” and get their desired effect. 
Here is the XAML that cause the annoying effect: <common:LayoutAwarePage x:Name="pageRoot" x:Class="App13.BasicPage1" xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" xmlns:common="using:App13.Common" xmlns:d="http://schemas.microsoft.com/expression/blend/2008" xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006" mc:Ignorable="d"> <Grid Style="{StaticResource LayoutRootStyle}"> <Grid.RowDefinitions> <RowDefinition Height="140" /> <RowDefinition Height="*" /> </Grid.RowDefinitions> <Image Source="Assets/el20120812025.jpg" Stretch="UniformToFill" Opacity="0.4" Grid.RowSpan="2" /> <Grid> <Grid.ColumnDefinitions> <ColumnDefinition Width="Auto" /> <ColumnDefinition Width="*" /> </Grid.ColumnDefinitions> <Button x:Name="backButton" Click="GoBack" IsEnabled="{Binding Frame.CanGoBack, ElementName=pageRoot}" Style="{StaticResource BackButtonStyle}" /> <TextBlock x:Name="pageTitle" Grid.Column="1" Text="Welcome" Style="{StaticResource PageHeaderTextStyle}" /> </Grid> <TextBlock HorizontalAlignment="Center" TextWrapping="Wrap" Text="Welcome to my Windows 8 Application" Grid.Row="1" VerticalAlignment="Bottom" FontFamily="Segoe UI Light" FontSize="70" FontWeight="Light" TextAlignment="Center" Foreground="#FFFFA200" RenderTransformOrigin="0.5,0.5" UseLayoutRounding="False" d:LayoutRounding="Auto" Margin="0,0,0,153"> <TextBlock.RenderTransform> <CompositeTransform Rotation="-6.545" /> </TextBlock.RenderTransform> </TextBlock> <VisualStateManager.VisualStateGroups> [...] </VisualStateManager.VisualStateGroups> </Grid> </common:LayoutAwarePage> Solving the issue In order to solve this “snapping” issue, the solution is to wrap the elements that are transformed into an empty Grid. Honestly, to me it sounds like a bug in the LayoutAwarePage navigation animation, but thankfully the workaround is not that difficult: Simple change the main Grid as follows: <Grid Style="{StaticResource LayoutRootStyle}"> <Grid.RowDefinitions> <RowDefinition Height="140" /> <RowDefinition Height="*" /> </Grid.RowDefinitions> <Grid Grid.RowSpan="2"> <Image Source="Assets/el20120812025.jpg" Stretch="UniformToFill" Opacity="0.4" /> </Grid> <Grid> <Grid.ColumnDefinitions> <ColumnDefinition Width="Auto" /> <ColumnDefinition Width="*" /> </Grid.ColumnDefinitions> <Button x:Name="backButton" Click="GoBack" IsEnabled="{Binding Frame.CanGoBack, ElementName=pageRoot}" Style="{StaticResource BackButtonStyle}" /> <TextBlock x:Name="pageTitle" Grid.Column="1" Text="Welcome" Style="{StaticResource PageHeaderTextStyle}" /> </Grid> <Grid Grid.Row="1"> <TextBlock HorizontalAlignment="Center" TextWrapping="Wrap" Text="Welcome to my Windows 8 Application" VerticalAlignment="Bottom" FontFamily="Segoe UI Light" FontSize="70" FontWeight="Light" TextAlignment="Center" Foreground="#FFFFA200" RenderTransformOrigin="0.5,0.5" UseLayoutRounding="False" d:LayoutRounding="Auto" Margin="0,0,0,153"> <TextBlock.RenderTransform> <CompositeTransform Rotation="-6.545" /> </TextBlock.RenderTransform> </TextBlock> </Grid> <VisualStateManager.VisualStateGroups> [...] </Grid> Hopefully this will help a few people, I banged my head on the wall for a while before someone at Microsoft pointed me to the solution ;) Happy coding, Laurent   Laurent Bugnion (GalaSoft) Subscribe | Twitter | Facebook | Flickr | LinkedIn

    Read the article

  • Nokia Windows Phone 8 App Collection

    - by Tim Murphy
    I recently upgraded to a Nokia Lumia 920. Along with it came the availability of a number of Nokia-developed apps, or apps that Nokia has made available from other developers. Below is a summary of some of the ones that I have used to this point. There are quite a few of them, so I won't be covering everything that is available.

    Nokia Maps
    I am quite pleased with the accuracy of Nokia Maps and with not having to tap the screen for each turn any more. The information on the screen is quite good as well. The couple of improvements I would like to see are for the voice directions to include which street or exit you need to use, and better search accuracy. Bing Maps had much better search results in my opinion.

    Nokia Drive
    This one really had me confused when I first set up the phone. I was driving down the road and suddenly I was getting notification tones, but there were no visual notifications on the phone. It seems that in their infinite wisdom Nokia thinks I don't know when I am going over the speed limit and need to be told.

    ESPN
    I really liked my ESPN app on Windows Phone 7.5, but I am not getting the type of experience I was looking for out of this app. It allows me to pick my favorite teams, but there isn't a pivot page or panorama page that shows a summary of my favorite teams. I have also found that the live tiles don't update very often. Overall I am rather disappointed compared to the app produced by ESPN.

    Smart Shoot
    I really need to get the kids to let me use this one. I like the concept, but I need to spend more time with it. The idea of running the camera through a continuous shooting mode and then picking the best shot is something that I have done with my DSLR, and I am glad to see it available here.

    Cinemagraph
    Here is a fun filter. It doesn't have the most accurate editing features, but it is fun to stop certain parts of a scene and let other parts move. As a test I stopped the traffic on the highway and let the traffic on the frontage road flow. It makes for a fun effect. If nothing else it could be great for sending prank animations to your friends.

    YouSendIt
    I have only briefly touched this application. What I don't understand is why it is needed. Most of the functionality seems to be similar to SkyDrive, and it gives you less storage. The only feature that seems to differentiate the app is the signature capability.

    Creative Studio
    This app has some nice quick edits, but it is not very comprehensive. I am also not too thrilled with the user experience. It puts you through an initial color-cast series, and I'm not sure why it is there. Discovery of the remaining adjustments isn't that great. In the end I found myself wanting Thumbia back.

    Panorama
    This is one of the apps that I like. I found it easy to use, as it guides you with a target circle that you center for it to take the next picture. It also stitches the images with amazing speed. The one thing I wish it had was the capability to turn the phone into portrait orientation and do a taller panorama. Perhaps we will see this in the future.

    Nokia Music
    After getting over the missing album art, I found that there were a number of missing features with this app as well. I have a Zune HD and I am used to being able to go through my collection and add songs, albums or artists to my now playing list. There also doesn't seem to be a way to manage playlists that I have seen yet. Other than that, the UI is familiar.

    Nokia City Lens
    Augmented reality is a cool concept, but I still haven't seen it implemented in a compelling fashion beyond a demo at TED a couple of years ago. This app still leaves me wanting as well, but it does make an interesting toy. It gives you the ability to look for general categories and see the general direction and clusters of locations. I think as this concept is better thought out it will become more compelling.

    Nokia Trailers
    I don't know how often I will use this app, but I do like being able to see what movies are being promoted. I can't wait for The Hobbit to come out, and the trailer was just what the doctor ordered. I can see coming back to this app from time to time.

    PhotoBeamer
    PhotoBeamer is a strange beast that needs a better instruction manual. It seems a lot like magic, but very confusing. I need to do some more testing, but I don't think this is something that most people are going to understand quickly, and they may give up before getting it to work. I may put an update here after playing with it further.

    Ringtone Maker
    The app was just published and it didn't work very well for me. It couldn't find 95% of the songs that Nokia Music was playing for me, and it crashed several times. It also had songs named wrong that, when I checked them in Nokia Music, were fine. This app looks like it has a long way to go.

    Summary
    In all, I think that Nokia is offering a well-rounded set of initial applications that can get any new owner started. There is definitely room for improvement in all of these apps; the main need is usability upgrades. I would guess that with feedback from users they will come up to acceptable levels. Try them out and see if you agree.

    Read the article

  • CodePlex Daily Summary for Wednesday, September 05, 2012

    CodePlex Daily Summary for Wednesday, September 05, 2012Popular ReleasesDesktop Google Reader: 1.4.6: Sorting feeds alphabetical is now optional (see preferences window)DotNetNuke® Community Edition CMS: 06.02.03: Major Highlights Fixed issue where mailto: links were not working when sending bulk email Fixed issue where uses did not see friendship relationships Problem is in 6.2, which does not show in the Versions Affected list above. Fixed the issue with cascade deletes in comments in CoreMessaging_Notification Fixed UI issue when using a date fields as a required profile property during user registration Fixed error when running the product in debug mode Fixed visibility issue when...Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.65: Fixed null-reference error in the build task constructor.BLACK ORANGE: HPAD TEXT EDITOR 0.9 Beta: HOW TO RUN THE TEXT EDITOR Download the HPAD ARCHIVED FILES which is in .rar format Extract using Winrar Make sure that extracted files are in the same folder Double-Click on HPAD.exe application fileTelerikMvcGridCustomBindingHelper: Version 1.0.15.247-RC2: TelerikMvcGridCustomBindingHelper 1.0.15.247 RC2 Release notes: This is a RC version (hopefully the last one), please test and report any error or problem you encounter. This release is all about performance and fixes Support: "Or" and "Does Not contain" filter options Improved BooleanSubstitutes, Custom Aggregates and expressions-to-queryover Add EntityFramework examples in ExampleWebApplication Many other improvements and fixes Fix invalid cast on CustomAggregates Support for ...ServiceMon - Extensible Real-time, Service Monitoring Utility: ServiceMon Release 0.9.0.44: Auto-uploaded from build serverJavaScript Grid: Release 09-05-2012: Release 09-05-2012xUnit.net Contrib: xunitcontrib-dotCover 0.6.1 (dotCover 2.1 beta): xunitcontrib release 0.6.1 for dotCover 2.1 beta This release provides a test runner plugin for dotCover 2.1 beta, targetting all versions of xUnit.net. (See the xUnit.net project to download xUnit.net itself.) This release adds support for running xUnit.net tests to dotCover 2.1 beta's Visual Studio plugin. PLEASE NOTE: You do NOT need this if you also have ReSharper and the existing 0.6.1 release installed. DotCover will use ReSharper's test runners if available. This release includes th...B INI Sharp Library: B INI Sharp Library v1.0.0.0 Realsed: The frist realsedActive Forums for DotNetNuke CMS: Active Forums 5.0.0 RC: RC release of Active Forums 5.0.Droid Explorer: Droid Explorer 0.8.8.7 Beta: Bug in the display icon for apk's, will fix with next release Added fallback icon if unable to get the image/icon from the Cloud Service Removed some stale plugins that were either out dated or incomplete. Added handler for *.ab files for restoring backups Added plugin to create device backups Backups stored in %USERPROFILE%\Android Backups\%DEVICE_ID%\ Added custom folder icon for the android backups directory better error handling for installing an apk bug fixes for the Runn...BI System Monitor: v2.1: Data Audits report and supporting SQL, and SSIS package Environment Overview report enhancements, improving the appearance, addition of data audit finding indicators Note: SQL 2012 version coming soon.The Visual Guide for Building Team Foundation Server 2012 Environments: Version 1: --Nearforums - ASP.NET MVC forum engine: Nearforums v8.5: Version 8.5 of Nearforums, the ASP.NET MVC Forum Engine. 
New features include: Built-in search engine using Lucene.NET Flood control improvements Notifications improvements: sync option and mail body View Roadmap for more details webdeploy package sha1 checksum: 961aff884a9187b6e8a86d68913cdd31f8deaf83WiX Toolset: WiX Toolset v3.6: WiX Toolset v3.6 introduces the Burn bootstrapper/chaining engine and support for Visual Studio 2012 and .NET Framework 4.5. Other minor functionality includes: WixDependencyExtension supports dependency checking among MSI packages. WixFirewallExtension supports more features of Windows Firewall. WixTagExtension supports Software Id Tagging. WixUtilExtension now supports recursive directory deletion. Melt simplifies pure-WiX patching by extracting .msi package content and updating .w...Iveely Search Engine: Iveely Search Engine (0.2.0): ????ISE?0.1.0??,?????,ISE?0.2.0?????????,???????,????????20???follow?ISE,????,??ISE??????????,??????????,?????????,?????????0.2.0??????,??????????。 Iveely Search Engine ?0.2.0?????????“??????????”,??????,?????????,???????,???????????????????,????、????????????。???0.1.0????????????: 1. ??“????” ??。??????????,?????????,???????????????????。??:????????,????????????,??????????????????。??????。 2. ??“????”??。?0.1.0??????,???????,???????????????,?????????????,????????,?0.2.0?,???????...GmailDefaultMaker: GmailDefaultMaker 3.0.0.2: Add QQ Mail BugfixSmart Data Access layer: Smart Data access Layer Ver 3: In this version support executing inline query is added. Check Documentation section for detail.DotNetNuke® Form and List: 06.00.04: DotNetNuke Form and List 06.00.04 Don't forget to backup your installation before upgrade. Changes in 06.00.04 Fix: Sql Scripts for 6.003 missed object qualifiers within stored procedures Fix: added missing resource "cmdCancel.Text" in form.ascx.resx Changes in 06.00.03 Fix: MakeThumbnail was broken if the application pool was configured to .Net 4 Change: Data is now stored in nvarchar(max) instead of ntext Changes in 06.00.02 The scripts are now compatible with SQL Azure, tested in a ne...Coevery - Free CRM: Coevery 1.0.0.24: Add a sample database, and installation instructions.New ProjectsA Simple Eng-Hindi CMS: A simple English- Hindi dual language content management system for small business/personal websites.Active Social Migrator: This project for managing the Active Social migration tool.ANSI Console User Control: Custom console control for .NET WinformsAutoSPInstallerGUI: GUI Configuration Tool for SPAutoInstaller Codeplex ProjectCode Documentation Checkin Policy: This checkin policy for Visual Studio 2012 checks if c# code is documented the way it's configured in the config of the policy. Code Dojo/Kata - Free Time Coding: Doing some katas of the Coding Dojo page. http://codingdojo.org/cgi-bin/wiki.pl?KataCataloguefjycUnifyShow: fjycUnifyShowHidden Capture (HC): HC is simple and easy utility to hidden and auto capture desktop or active windowHRC Integration Services: Fake SQL Server Integration Services. LOLKooboo CMS Sites Switcher: Kooboo CMS Sites SwitcherMod.CookieDetector: Orchard module for detecting whether cookies are enabledMyCodes: Created!MySQL Statement Monitor: MySQL Statement Monitor is a monitoring tool that monitors SQL statements transferred over the network.NeoModulusPIRandom: The idea with PI Random is to use easy string manipulation and simple math to generate a pseudo random number. 
Net Core Tech - Medical Record System: This is a Medical Record System ProjectOraPowerShell: PowerShell library for backup and maintenance of a Oracle Database environment under Microsoft Windows 2008PinDNN: PinDNN is a module that imparts Pinterest-like functionality to DotNetNuke sites. This module works with a MongoDB database and uses the built-in social relatioPyrogen Code Generator: PyroGen is a simple code generator accepting C# as the markup language.restMs: wil be deleted soonScript.NET: Script.NET is a script management utility for web forms and MVC, using ScriptJS-like features to link dependencies between scripts.SpringExample-Pagination: Simple Spring example with PaginationXNA and Component Based Design: This project includes code for XNA and Component Based Design

    Read the article

  • JPA 2.1 Schema Generation (TOTD #187)

    - by arungupta
    This blog explained some of the key features of JPA 2.1 earlier. Since then Schema Generation has been added to JPA 2.1. This Tip Of The Day (TOTD) will provide more details about this new feature in JPA 2.1. Schema Generation refers to the generation of database artifacts like tables, indexes, and constraints in a database schema. It may or may not involve generation of a proper database schema, depending upon the credentials and authorization of the user. This helps in prototyping of your application, where the required artifacts are generated either prior to application deployment or as part of EntityManagerFactory creation. This is also useful in environments that require provisioning a database on demand, e.g. in a cloud. This feature will allow your JPA domain object model to be generated directly in a database. The generated schema may need to be tuned for the actual production environment. This use case is supported by allowing the schema generation to occur into DDL scripts, which can then be further tuned by a DBA. The following set of properties, specified in persistence.xml or during EntityManagerFactory creation, controls the behaviour of schema generation:

    - javax.persistence.schema-generation-action — controls the action to be taken by the persistence provider. Values: "none", "create", "drop-and-create", "drop".
    - javax.persistence.schema-generation-target — controls whether the schema is created in the database, whether DDL scripts are created, or both. Values: "database", "scripts", "database-and-scripts".
    - javax.persistence.ddl-create-script-target, javax.persistence.ddl-drop-script-target — control the target locations for the writing of scripts. Writers are pre-configured for the persistence provider and need to be specified only if scripts are to be generated. Values: java.io.Writer (e.g. MyWriter.class) or URL strings.
    - javax.persistence.ddl-create-script-source, javax.persistence.ddl-drop-script-source — specify the locations from which DDL scripts are to be read. Readers are pre-configured for the persistence provider. Values: java.io.Reader (e.g. MyReader.class) or URL strings.
    - javax.persistence.sql-load-script-source — specifies the location of a SQL bulk load script. Values: java.io.Reader (e.g. MyReader.class) or URL string.
    - javax.persistence.schema-generation-connection — JDBC connection to be used for schema generation.
    - javax.persistence.database-product-name, javax.persistence.database-major-version, javax.persistence.database-minor-version — needed if scripts are to be generated and there is no connection to the target database. Values are those obtained from JDBC DatabaseMetaData.
    - javax.persistence.create-database-schemas — whether the persistence provider needs to create schemas in addition to creating database objects such as tables, sequences, constraints, etc. Values: "true", "false".

    Section 11.2 in the JPA 2.1 specification defines the annotations used in the schema generation process. For example, @Table, @Column, @CollectionTable, @JoinTable, and @JoinColumn are used to define the generated schema. Several layers of defaulting may be involved. For example, the table name is defaulted from the entity name, and the entity name (which can also be specified explicitly) is defaulted from the class name. However, annotations may be used to override or customize the values. The following entity class:

        @Entity
        public class Employee {
            @Id private int id;
            private String name;
            . . .
            @ManyToOne
            private Department dept;
        }

    is generated in the database with the following attributes:

    - Maps to the EMPLOYEE table in the default schema.
    - The "id" field is mapped to the ID column as the primary key.
    - "name" is mapped to the NAME column with a default of VARCHAR(255). The length of this field can easily be tuned using @Column.
    - @ManyToOne is mapped to the DEPT_ID foreign key column. Can be customized using @JoinColumn.

    In addition to these properties, a couple of new annotations are added in JPA 2.1:

    - @Index - An index for the primary key is generated by default in a database. This new annotation allows additional indexes to be defined, over a single column or multiple columns, for better performance. It is specified as part of @Table, @SecondaryTable, @CollectionTable, @JoinTable, and @TableGenerator. For example:

        @Table(indexes = {@Index(columnList="NAME"), @Index(columnList="DEPT_ID DESC")})
        @Entity
        public class Employee {
            . . .
        }

      The generated table will have a default index on the primary key. In addition, two new indexes are defined: one on the NAME column (ascending by default) and one on the foreign key that maps to the department, in descending order.

    - @ForeignKey - Used to define a foreign key constraint, or to otherwise override or disable the persistence provider's default foreign key definition. It can be specified as part of JoinColumn(s), MapKeyJoinColumn(s), and PrimaryKeyJoinColumn(s). For example:

        @Entity
        public class Employee {
            @Id private int id;
            private String name;
            @ManyToOne
            @JoinColumn(foreignKey=@ForeignKey(foreignKeyDefinition="FOREIGN KEY (MANAGER_ID) REFERENCES MANAGER"))
            private Manager manager;
            . . .
        }

      In this entity, the employee's manager is mapped by the MANAGER_ID column in the MANAGER table. The value of foreignKeyDefinition would be a database-specific string.

    A complete replay of Linda's talk at JavaOne 2012 can be seen here (click on CON4212_mp4_4212_001 in Media). These features will be available in GlassFish 4 promoted builds in the near future. JPA 2.1 will be delivered as part of Java EE 7. The different components in the Java EE 7 platform are tracked here. The JPA 2.1 Expert Group has released Early Draft 2 of the specification; Sections 9.4 and 11.2 provide all the details about Schema Generation. The latest javadocs can be obtained from here. And the JPA EG would appreciate feedback.
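    To make the property table above concrete, here is a minimal sketch of passing the schema-generation properties programmatically when creating an EntityManagerFactory. The persistence unit name "employee-pu" and the script file names are hypothetical, and the property keys are the Early Draft 2 names listed above, which may change before the specification is final:

        import java.io.FileWriter;
        import java.util.HashMap;
        import java.util.Map;
        import javax.persistence.EntityManagerFactory;
        import javax.persistence.Persistence;

        public class SchemaGenerationDemo {
            public static void main(String[] args) throws Exception {
                Map<String, Object> props = new HashMap<String, Object>();
                // Drop and re-create the schema, both in the database and as DDL scripts.
                props.put("javax.persistence.schema-generation-action", "drop-and-create");
                props.put("javax.persistence.schema-generation-target", "database-and-scripts");
                // Writers for the generated scripts (file names are hypothetical).
                props.put("javax.persistence.ddl-create-script-target", new FileWriter("create.sql"));
                props.put("javax.persistence.ddl-drop-script-target", new FileWriter("drop.sql"));

                // "employee-pu" is a placeholder persistence unit defined in persistence.xml.
                EntityManagerFactory emf =
                        Persistence.createEntityManagerFactory("employee-pu", props);
                // Table, index, and constraint generation happens as part of factory creation.
                emf.close();
            }
        }

    The same keys can of course be placed as <property> elements in persistence.xml instead of being passed in the map.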

    Read the article

  • "Oracle ?????????" Oracle Days Tokyo 2012 ?????

    - by OTN-J Master
    Oracle Days Tokyo 2012 will be held over two days, October 30 (Tue) and 31 (Wed). Under the theme of "Simplified IT, Unleash Innovation", the event builds on Oracle OpenWorld 2012, held September 30 through October 4, and brings its latest announcements to Japan, with 13 tracks and some 60 sessions across the two days.

    Day 1's sessions include coverage of Exadata. Day 2 covers cloud-related topics, including Oracle Cloud, with tracks featuring WebLogic Server 12, Oracle Fusion Middleware 11g, Oracle Exalogic, Oracle Event Processing, Oracle Coherence, Oracle Tuxedo ART 12c, and Java.

    [The remainder of the original Japanese announcement, including registration details, is mis-encoded in this summary and could not be recovered.]

    Read the article

  • Parse JSON in C#

    - by Ender
    I'm trying to parse some JSON data from the Google AJAX Search API. I have this URL and I'd like to break it down so that the results are displayed. I've currently written this code, but I'm pretty lost in regards of what to do next, although there are a number of examples out there with simplified JSON strings. Being new to C# and .NET in general I've struggled to get a genuine text output for my ASP.NET page so I've been recommended to give JSON.NET a try. Could anyone point me in the right direction to just simply writing some code that'll take in JSON from the Google AJAX Search API and print it out to the screen? EDIT: I think I've made some progress in regards to getting some code working using DataContractJsonSerializer. Here is the code I have so far. Any advice on whether this is close to working and/or how I would output my results in a clean format? EDIT 2: I've followed the advice from Dreas Grech and the StackOverflowException has gone. However, now I am getting no output. Any ideas on where to go next? EDIT 3: ALL FIXED! All results are working fine. Thank you again Dreas Grech! using System; using System.Data; using System.Configuration; using System.Web; using System.Web.Security; using System.Web.UI; using System.Web.UI.WebControls; using System.Web.UI.WebControls.WebParts; using System.Web.UI.HtmlControls; using System.ServiceModel.Web; using System.Runtime.Serialization; using System.Runtime.Serialization.Json; using System.IO; using System.Text; public partial class _Default : System.Web.UI.Page { protected void Page_Load(object sender, EventArgs e) { GoogleSearchResults g1 = new GoogleSearchResults(); const string json = @"{""responseData"": {""results"":[{""GsearchResultClass"":""GwebSearch"",""unescapedUrl"":""http://www.cheese.com/"",""url"":""http://www.cheese.com/"",""visibleUrl"":""www.cheese.com"",""cacheUrl"":""http://www.google.com/search?q\u003dcache:bkg1gwNt8u4J:www.cheese.com"",""title"":""\u003cb\u003eCHEESE\u003c/b\u003e.COM - All about \u003cb\u003echeese\u003c/b\u003e!."",""titleNoFormatting"":""CHEESE.COM - All about cheese!."",""content"":""\u003cb\u003eCheese\u003c/b\u003e - everything you want to know about it. Search \u003cb\u003echeese\u003c/b\u003e by name, by types of milk, by textures and by countries.""},{""GsearchResultClass"":""GwebSearch"",""unescapedUrl"":""http://en.wikipedia.org/wiki/Cheese"",""url"":""http://en.wikipedia.org/wiki/Cheese"",""visibleUrl"":""en.wikipedia.org"",""cacheUrl"":""http://www.google.com/search?q\u003dcache:n9icdgMlCXIJ:en.wikipedia.org"",""title"":""\u003cb\u003eCheese\u003c/b\u003e - Wikipedia, the free encyclopedia"",""titleNoFormatting"":""Cheese - Wikipedia, the free encyclopedia"",""content"":""\u003cb\u003eCheese\u003c/b\u003e is a food consisting of proteins and fat from milk, usually the milk of cows, buffalo, goats, or sheep. 
It is produced by coagulation of the milk \u003cb\u003e...\u003c/b\u003e""},{""GsearchResultClass"":""GwebSearch"",""unescapedUrl"":""http://www.ilovecheese.com/"",""url"":""http://www.ilovecheese.com/"",""visibleUrl"":""www.ilovecheese.com"",""cacheUrl"":""http://www.google.com/search?q\u003dcache:GBhRR8ytMhQJ:www.ilovecheese.com"",""title"":""I Love \u003cb\u003eCheese\u003c/b\u003e!, Homepage"",""titleNoFormatting"":""I Love Cheese!, Homepage"",""content"":""The American Dairy Association\u0026#39;s official site includes recipes and information on nutrition and storage of \u003cb\u003echeese\u003c/b\u003e.""},{""GsearchResultClass"":""GwebSearch"",""unescapedUrl"":""http://www.gnome.org/projects/cheese/"",""url"":""http://www.g

    Read the article

  • How to use BCDEdit to dual boot Windows installations?

    - by Ian Boyd
    What are the bcdedit commands necessary to setup dual boot between different installations of Windows?5 Background i recently installed Windows 8 onto a separate hard drive1. Now that Windows 8 in installed i want to dual-boot back to Windows 7. i have my two2 hard drives: So you can see that i have my two disks, with the partitions containing Windows: Windows 7: \\PhysicalDisk0 (partition 03) Windows 8: \\PhysicalDisk2 (partition 1) What i'm trying to figure out how is how to use bcdedit to instruct the thing that boots Windows that there is another Windows installation out there. Running bcdedit now, it shows current configuration: C:\WINDOWS\system32>bcdedit Windows Boot Manager -------------------- identifier {bootmgr} device partition=\Device\HarddiskVolume2 description Windows Boot Manager locale en-US inherit {globalsettings} integrityservices Enable default {current} resumeobject {ce153eb7-3786-11e2-87c0-e740e123299f} displayorder {current} toolsdisplayorder {memdiag} timeout 30 Windows Boot Loader ------------------- identifier {current} device partition=C: path \WINDOWS\system32\winload.exe description Windows 8 locale en-US inherit {bootloadersettings} recoverysequence {ce153eb9-3786-11e2-87c0-e740e123299f} integrityservices Enable recoveryenabled Yes allowedinmemorysettings 0x15000075 osdevice partition=C: systemroot \WINDOWS resumeobject {ce153eb7-3786-11e2-87c0-e740e123299f} nx OptIn bootmenupolicy Standard hypervisorlaunchtype Auto i cannot find any documentation on the difference between Windows Boot Manager and Windows Boot Loader. Documentation There is some documentation on Bcdedit: Technet: Command Line Reference - Bcdedit Technet: Windows Automated Installation Kit - BCDEdit Command Line Options Whitepaper - BCDEdit Commands for Boot Environment (Word Document) But they don't explain how edit the binary boot configuration data If i had to guess, i would think that a Windows Boot Manager instructs the BIOS what program it should run. That program would give the user a set of boot choices. That leaves Windows Boot Loader do be a particular boot choice, that represents a particular installation of Windows. If that is the case i would need to create a new Windows Boot Loader entry. This means i might want to use the /create parameter: /create Creates a new boot entry: bcdedit [/store filename] /create [id] /d description [/application apptype | /inherit [apptype] | /inherit DEVICE | /device] So i assume a syntax of: >bcdedit /create /d "The old Windows 7" /application osloader Where application can be one of the following types: Apptype Description BOOTSECTOR The boot sector application OSLOADER The Windows boot loader RESUME A resume application Unfortunately, the only documentation about osloader is "The Windows boot loader". i don't see how that can differentiate between Windows 8 on one hard drive, and Windows 7 on another. The other possible parameter when /create a boot loader is >bcdedit /create /D "Windows Vista" /device "The Quick Brown Fox" Unfortunately the documentation is missing for /device: /device Optional. If id is not set to a well-known identifier, the option that is used to specify the new boot entry as an additional device options entry. Since i did not set id to a well-known identifier, i must set /device to "the option that is used to specify the new boot entry as an additional device options entry". i know all those words; they're all English. But i have on idea what it is saying; those words in that order seem nonsensical. 
So i'm somewhat stymied. i don't want to be like Dan Stolts from Microsoft: I found no content that was particularly helpful when I hosed my machine by playing with BCDEdit. This post would have been ok if there was much more detail especially on the /set command OSDevice, etc. So once I got my machine fixed, I documented the solution and the information is here.... i mean, if a Microsoft guy can't even figure out how to use BCDEdit to edit his BCD, then what chance to i have? Bonus Reading BCDEdit Command-Line Options Bcdedit Server 2008 R2 or Windows 7 System Will NOT Boot After Making Changes To Boot Manager Using BCDEdit Visual BCD Editor4 Windows 7 and Windows 8 RTM Dual Boot Setup Footnotes 1 Since the Windows 8 installer would have damaged my Windows 7 install, i decided to unplug my "main" hard drive during the install. Which is a long-winded explanation of why the Windows 8 installer didn't detect the existing Windows 7 install. Normally the installer would have automatically created the required entries for dual-boot. Not that the reason i'm asking the question is important. 2 Really there's three drives, but the third is just bulk storage. The existence of a 3rd hard drive is irrelevant to the question. i only mention it in case someone wants to know why the screenshot has 3 hard drives when i only mention two. 3 i arbitrarily started numbering partitions at "zero"; not to imply that partitions are numbered starting at zero. i only mention partitions because i don't see how any boot-loader could do its job without knowing which partition, and which folder, an installation of Windows is located in. 4 i'm asking about BCDEdit. i tried Visual BCD Editor. It seems to be a visual BCD editor. That is to say that it's a GUI, but still uses the same terminology as BCDEdit, and requires the same knowledge that BCD doesn't document. 5 For simplicity sake we'll assume that all installation of Windows i want to dual-boot between are Windows Vista or later, making them all compatible with the BCDEdit and the binary boot loader. The alternative would require delving into the intricacies of the old ntloader. Nor am i asking about dual booting to Linux; or how to boot to a Virtual Hard Drive (vhd) image. Just modern versions of Windows on existing hard drives in the same machine. Note: You can ignore everything after the word Background. It's all pointless exposition to satisfy some people's need for "research effort" before they'll consider being helpful. Some people have even been known to summarily close questions unless there is research effort. Some people have been know to close questions if there is too much research effort. Some people close questions when i put the note saying that they can ignore everything after the Background out of spite. Some people are just grumpy.

    Read the article

  • Loading PNGs into OpenGL performance issues - Java & JOGL much slower than C# & Tao.OpenGL

    - by Edward Cresswell
    I am noticing a large performance difference between Java & JOGL and C# & Tao.OpenGL when both loading PNGs from storage into memory, and when loading that BufferedImage (java) or Bitmap (C# - both are PNGs on hard drive) 'into' OpenGL. This difference is quite large, so I assumed I was doing something wrong, however after quite a lot of searching and trying different loading techniques I've been unable to reduce this difference. With Java I get an image loaded in 248ms and loaded into OpenGL in 728ms The same on C# takes 54ms to load the image, and 34ms to load/create texture. The image in question above is a PNG containing transparency, sized 7200x255, used for a 2D animated sprite. I realise the size is really quite ridiculous and am considering cutting up the sprite, however the large difference is still there (and confusing). On the Java side the code looks like this: BufferedImage image = ImageIO.read(new File(fileName)); texture = TextureIO.newTexture(image, false); texture.setTexParameteri(GL.GL_TEXTURE_MIN_FILTER, GL.GL_LINEAR); texture.setTexParameteri(GL.GL_TEXTURE_MAG_FILTER, GL.GL_LINEAR); The C# code uses: Bitmap t = new Bitmap(fileName); t.RotateFlip(RotateFlipType.RotateNoneFlipY); Rectangle r = new Rectangle(0, 0, t.Width, t.Height); BitmapData bd = t.LockBits(r, ImageLockMode.ReadOnly, PixelFormat.Format32bppArgb); Gl.glBindTexture(Gl.GL_TEXTURE_2D, tID); Gl.glTexImage2D(Gl.GL_TEXTURE_2D, 0, Gl.GL_RGBA, t.Width, t.Height, 0, Gl.GL_BGRA, Gl.GL_UNSIGNED_BYTE, bd.Scan0); Gl.glTexParameteri(Gl.GL_TEXTURE_2D, Gl.GL_TEXTURE_MIN_FILTER, Gl.GL_LINEAR); Gl.glTexParameteri(Gl.GL_TEXTURE_2D, Gl.GL_TEXTURE_MAG_FILTER, Gl.GL_LINEAR); t.UnlockBits(bd); t.Dispose(); After quite a lot of testing I can only come to the conclusion that Java/JOGL is just slower here - PNG reading might not be as quick, or that I'm still doing something wrong. Thanks. Edit2: I have found that creating a new BufferedImage with format TYPE_INT_ARGB_PRE decreases OpenGL texture load time by almost half - this includes having to create the new BufferedImage, getting the Graphics2D from it and then rendering the previously loaded image to it. Edit3: Benchmark results for 5 variations. I wrote a small benchmarking tool, the following results come from loading a set of 33 pngs, most are very wide, 5 times. testStart: ImageIO.read(file) -> TextureIO.newTexture(image) result: avg = 10250ms, total = 51251 testStart: ImageIO.read(bis) -> TextureIO.newTexture(image) result: avg = 10029ms, total = 50147 testStart: ImageIO.read(file) -> TextureIO.newTexture(argbImage) result: avg = 5343ms, total = 26717 testStart: ImageIO.read(bis) -> TextureIO.newTexture(argbImage) result: avg = 5534ms, total = 27673 testStart: TextureIO.newTexture(file) result: avg = 10395ms, total = 51979 ImageIO.read(bis) refers to the technique described in James Branigan's answer below. argbImage refers to the technique described in my previous edit: img = ImageIO.read(file); argbImg = new BufferedImage(img.getWidth(), img.getHeight(), TYPE_INT_ARGB_PRE); g = argbImg.createGraphics(); g.drawImage(img, 0, 0, null); texture = TextureIO.newTexture(argbImg, false); Any more methods of loading (either images from file, or images to OpenGL) would be appreciated, I will update these benchmarks.
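    For illustration, a minimal sketch of the TYPE_INT_ARGB_PRE conversion described in Edit 2 is shown below. It assumes the JOGL 1.x com.sun.opengl.util.texture package that the snippet above appears to use; the class name and method are hypothetical helpers, not part of JOGL:

        import java.awt.Graphics2D;
        import java.awt.image.BufferedImage;
        import java.io.File;
        import java.io.IOException;
        import javax.imageio.ImageIO;
        import com.sun.opengl.util.texture.Texture;
        import com.sun.opengl.util.texture.TextureIO;

        public final class FastPngTextureLoader {
            private FastPngTextureLoader() {}

            // Reads the PNG, redraws it into a pre-multiplied ARGB image, and only
            // then hands it to TextureIO. Per Edit 2 above, this roughly halves the
            // texture load time, even counting the extra image copy.
            public static Texture load(File pngFile) throws IOException {
                BufferedImage img = ImageIO.read(pngFile);
                BufferedImage argb = new BufferedImage(
                        img.getWidth(), img.getHeight(), BufferedImage.TYPE_INT_ARGB_PRE);
                Graphics2D g = argb.createGraphics();
                g.drawImage(img, 0, 0, null);
                g.dispose();
                return TextureIO.newTexture(argb, false);
            }
        }

    The caller-side usage is unchanged from the original code, so the texture min/mag filter parameters can still be set on the returned Texture afterwards.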

    Read the article

  • Android: OutOfMemoryError while uploading video...

    - by AP257
    Hi all, I have the same problem as described here, but I will supply a few more details. While trying to upload a video in Android, I'm reading it into memory, and if the video is large I get an OutOfMemoryError. Here's my code: // get bytestream to upload videoByteArray = getBytesFromFile(cR, fileUriString); public static byte[] getBytesFromFile(ContentResolver cR, String fileUriString) throws IOException { Uri tempuri = Uri.parse(fileUriString); InputStream is = cR.openInputStream(tempuri); byte[] b3 = readBytes(is); is.close(); return b3; } public static byte[] readBytes(InputStream inputStream) throws IOException { ByteArrayOutputStream byteBuffer = new ByteArrayOutputStream(); // this is storage overwritten on each iteration with bytes int bufferSize = 1024; byte[] buffer = new byte[bufferSize]; int len = 0; while ((len = inputStream.read(buffer)) != -1) { byteBuffer.write(buffer, 0, len); } return byteBuffer.toByteArray(); } And here's the traceback (the error is thrown on the byteBuffer.write(buffer, 0, len) line): 04-08 11:56:20.456: ERROR/dalvikvm-heap(6088): Out of memory on a 16775184-byte allocation. 04-08 11:56:20.456: INFO/dalvikvm(6088): "IntentService[UploadService]" prio=5 tid=17 RUNNABLE 04-08 11:56:20.456: INFO/dalvikvm(6088): | group="main" sCount=0 dsCount=0 s=N obj=0x449a3cf0 self=0x38d410 04-08 11:56:20.456: INFO/dalvikvm(6088): | sysTid=6119 nice=0 sched=0/0 cgrp=default handle=4010416 04-08 11:56:20.456: INFO/dalvikvm(6088): at java.io.ByteArrayOutputStream.expand(ByteArrayOutputStream.java:~93) 04-08 11:56:20.456: INFO/dalvikvm(6088): at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:218) 04-08 11:56:20.456: INFO/dalvikvm(6088): at com.android.election2010.UploadService.readBytes(UploadService.java:199) 04-08 11:56:20.456: INFO/dalvikvm(6088): at com.android.election2010.UploadService.getBytesFromFile(UploadService.java:182) 04-08 11:56:20.456: INFO/dalvikvm(6088): at com.android.election2010.UploadService.doUploadinBackground(UploadService.java:118) 04-08 11:56:20.456: INFO/dalvikvm(6088): at com.android.election2010.UploadService.onHandleIntent(UploadService.java:85) 04-08 11:56:20.456: INFO/dalvikvm(6088): at android.app.IntentService$ServiceHandler.handleMessage(IntentService.java:30) 04-08 11:56:20.456: INFO/dalvikvm(6088): at android.os.Handler.dispatchMessage(Handler.java:99) 04-08 11:56:20.456: INFO/dalvikvm(6088): at android.os.Looper.loop(Looper.java:123) 04-08 11:56:20.456: INFO/dalvikvm(6088): at android.os.HandlerThread.run(HandlerThread.java:60) 04-08 11:56:20.467: WARN/dalvikvm(6088): threadid=17: thread exiting with uncaught exception (group=0x4001b180) 04-08 11:56:20.467: ERROR/AndroidRuntime(6088): Uncaught handler: thread IntentService[UploadService] exiting due to uncaught exception 04-08 11:56:20.467: ERROR/AndroidRuntime(6088): java.lang.OutOfMemoryError 04-08 11:56:20.467: ERROR/AndroidRuntime(6088): at java.io.ByteArrayOutputStream.expand(ByteArrayOutputStream.java:93) 04-08 11:56:20.467: ERROR/AndroidRuntime(6088): at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:218) 04-08 11:56:20.467: ERROR/AndroidRuntime(6088): at com.android.election2010.UploadService.readBytes(UploadService.java:199) 04-08 11:56:20.467: ERROR/AndroidRuntime(6088): at com.android.election2010.UploadService.getBytesFromFile(UploadService.java:182) 04-08 11:56:20.467: ERROR/AndroidRuntime(6088): at com.android.election2010.UploadService.doUploadinBackground(UploadService.java:118) 04-08 11:56:20.467: ERROR/AndroidRuntime(6088): at 
com.android.election2010.UploadService.onHandleIntent(UploadService.java:85) 04-08 11:56:20.467: ERROR/AndroidRuntime(6088): at android.app.IntentService$ServiceHandler.handleMessage(IntentService.java:30) 04-08 11:56:20.467: ERROR/AndroidRuntime(6088): at android.os.Handler.dispatchMessage(Handler.java:99) 04-08 11:56:20.467: ERROR/AndroidRuntime(6088): at android.os.Looper.loop(Looper.java:123) 04-08 11:56:20.467: ERROR/AndroidRuntime(6088): at android.os.HandlerThread.run(HandlerThread.java:60) 04-08 11:56:20.496: INFO/Process(4657): Sending signal. PID: 6088 SIG: 3 I guess that as @DroidIn suggests, I need to upload it in chunks. But (newbie question alert) does that mean that I should make multiple PostMethod requests, and glue the file together at the server end? Or can I load the bytestream into memory in chunks, and glue it together in the Android code? If anyone could give me a clue as to the best approach, I would be very grateful.
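    For what it's worth, one way to avoid holding the whole video in memory is to stream it straight from the ContentResolver to the server rather than building a byte[] first. The sketch below uses HttpURLConnection's chunked streaming mode instead of the PostMethod mentioned above; the upload URL is hypothetical and assumes the server accepts a single raw POST body, so nothing needs to be glued together on either end:

        import java.io.InputStream;
        import java.io.OutputStream;
        import java.net.HttpURLConnection;
        import java.net.URL;

        import android.content.ContentResolver;
        import android.net.Uri;

        public class ChunkedUploader {
            // Streams the video from the ContentResolver to the server in small
            // buffers, so the whole file is never held in memory at once.
            public static void upload(ContentResolver cr, String fileUriString,
                                      String uploadUrl) throws Exception {
                InputStream in = cr.openInputStream(Uri.parse(fileUriString));
                HttpURLConnection conn =
                        (HttpURLConnection) new URL(uploadUrl).openConnection();
                conn.setDoOutput(true);
                conn.setRequestMethod("POST");
                conn.setChunkedStreamingMode(8 * 1024); // send 8 KB chunks
                OutputStream out = conn.getOutputStream();
                try {
                    byte[] buffer = new byte[8 * 1024];
                    int len;
                    while ((len = in.read(buffer)) != -1) {
                        out.write(buffer, 0, len);
                    }
                    out.flush();
                } finally {
                    out.close();
                    in.close();
                }
                int status = conn.getResponseCode(); // the server sees one continuous body
                conn.disconnect();
            }
        }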

    Read the article

  • Setting up a VPN connection to Amazon VPC - routing

    - by Keeno
    I am having some real issues setting up a VPN between out office and AWS VPC. The "tunnels" appear to be up, however I don't know if they are configured correctly. The device I am using is a Netgear VPN Firewall - FVS336GV2 If you see in the attached config downloaded from VPC (#3 Tunnel Interface Configuration), it gives me some "inside" addresses for the tunnel. When setting up the IPsec tunnels do I use the inside tunnel IP's (e.g. 169.254.254.2/30) or do I use my internal network subnet (10.1.1.0/24) I have tried both, when I tried the local network (10.1.1.x) the tracert stops at the router. When I tried with the "inside" ips, the tracert to the amazon VPC (10.0.0.x) goes out over the internet. this all leads me to the next question, for this router, how do I set up stage #4, the static next hop? What are these seemingly random "inside" addresses and where did amazon generate them from? 169.254.254.x seems odd? With a device like this, is the VPN behind the firewall? I have tweaked any IP addresses below so that they are not "real". I am fully aware, this is probably badly worded. Please if there is any further info/screenshots that will help, let me know. Amazon Web Services Virtual Private Cloud IPSec Tunnel #1 ================================================================================ #1: Internet Key Exchange Configuration Configure the IKE SA as follows - Authentication Method : Pre-Shared Key - Pre-Shared Key : --- - Authentication Algorithm : sha1 - Encryption Algorithm : aes-128-cbc - Lifetime : 28800 seconds - Phase 1 Negotiation Mode : main - Perfect Forward Secrecy : Diffie-Hellman Group 2 #2: IPSec Configuration Configure the IPSec SA as follows: - Protocol : esp - Authentication Algorithm : hmac-sha1-96 - Encryption Algorithm : aes-128-cbc - Lifetime : 3600 seconds - Mode : tunnel - Perfect Forward Secrecy : Diffie-Hellman Group 2 IPSec Dead Peer Detection (DPD) will be enabled on the AWS Endpoint. We recommend configuring DPD on your endpoint as follows: - DPD Interval : 10 - DPD Retries : 3 IPSec ESP (Encapsulating Security Payload) inserts additional headers to transmit packets. These headers require additional space, which reduces the amount of space available to transmit application data. To limit the impact of this behavior, we recommend the following configuration on your Customer Gateway: - TCP MSS Adjustment : 1387 bytes - Clear Don't Fragment Bit : enabled - Fragmentation : Before encryption #3: Tunnel Interface Configuration Your Customer Gateway must be configured with a tunnel interface that is associated with the IPSec tunnel. All traffic transmitted to the tunnel interface is encrypted and transmitted to the Virtual Private Gateway. The Customer Gateway and Virtual Private Gateway each have two addresses that relate to this IPSec tunnel. Each contains an outside address, upon which encrypted traffic is exchanged. Each also contain an inside address associated with the tunnel interface. The Customer Gateway outside IP address was provided when the Customer Gateway was created. Changing the IP address requires the creation of a new Customer Gateway. The Customer Gateway inside IP address should be configured on your tunnel interface. 
Outside IP Addresses: - Customer Gateway : 217.33.22.33 - Virtual Private Gateway : 87.222.33.42 Inside IP Addresses - Customer Gateway : 169.254.254.2/30 - Virtual Private Gateway : 169.254.254.1/30 Configure your tunnel to fragment at the optimal size: - Tunnel interface MTU : 1436 bytes #4: Static Routing Configuration: To route traffic between your internal network and your VPC, you will need a static route added to your router. Static Route Configuration Options: - Next hop : 169.254.254.1 You should add static routes towards your internal network on the VGW. The VGW will then send traffic towards your internal network over the tunnels. IPSec Tunnel #2 ================================================================================ #1: Internet Key Exchange Configuration Configure the IKE SA as follows - Authentication Method : Pre-Shared Key - Pre-Shared Key : --- - Authentication Algorithm : sha1 - Encryption Algorithm : aes-128-cbc - Lifetime : 28800 seconds - Phase 1 Negotiation Mode : main - Perfect Forward Secrecy : Diffie-Hellman Group 2 #2: IPSec Configuration Configure the IPSec SA as follows: - Protocol : esp - Authentication Algorithm : hmac-sha1-96 - Encryption Algorithm : aes-128-cbc - Lifetime : 3600 seconds - Mode : tunnel - Perfect Forward Secrecy : Diffie-Hellman Group 2 IPSec Dead Peer Detection (DPD) will be enabled on the AWS Endpoint. We recommend configuring DPD on your endpoint as follows: - DPD Interval : 10 - DPD Retries : 3 IPSec ESP (Encapsulating Security Payload) inserts additional headers to transmit packets. These headers require additional space, which reduces the amount of space available to transmit application data. To limit the impact of this behavior, we recommend the following configuration on your Customer Gateway: - TCP MSS Adjustment : 1387 bytes - Clear Don't Fragment Bit : enabled - Fragmentation : Before encryption #3: Tunnel Interface Configuration Outside IP Addresses: - Customer Gateway : 217.33.22.33 - Virtual Private Gateway : 87.222.33.46 Inside IP Addresses - Customer Gateway : 169.254.254.6/30 - Virtual Private Gateway : 169.254.254.5/30 Configure your tunnel to fragment at the optimal size: - Tunnel interface MTU : 1436 bytes #4: Static Routing Configuration: Static Route Configuration Options: - Next hop : 169.254.254.5 You should add static routes towards your internal network on the VGW. The VGW will then send traffic towards your internal network over the tunnels. EDIT #1 After writing this post, I continued to fiddle and something started to work, just not very reliably. The local IPs to use when setting up the tunnels where indeed my network subnets. Which further confuses me over what these "inside" IP addresses are for. The problem is, results are not consistent what so ever. I can "sometimes" ping, I can "sometimes" RDP using the VPN. Sometimes, Tunnel 1 or Tunnel 2 can be up or down. When I came back into work today, Tunnel 1 was down, so I deleted it and re-created it from scratch. Now I cant ping anything, but Amazon AND the router are telling me tunnel 1/2 are fine. I guess the router/vpn hardware I have just isnt up to the job..... EDIT #2 Now Tunnel 1 is up, Tunnel 2 is down (I didn't change any settings) and I can ping/rdp again. EDIT #3 Screenshot of route table that the router has built up. Current state (tunnel 1 still up and going string, 2 is still down and wont re-connect)

    Read the article

  • Java: micro-optimizing array manipulation

    - by Martin Wiboe
    Hello all, I am trying to make a Java port of a simple feed-forward neural network. This obviously involves lots of numeric calculations, so I am trying to optimize my central loop as much as possible. The results should be correct within the limits of the float data type. My current code looks as follows (error handling & initialization removed): /** * Simple implementation of a feedforward neural network. The network supports * including a bias neuron with a constant output of 1.0 and weighted synapses * to hidden and output layers. * * @author Martin Wiboe */ public class FeedForwardNetwork { private final int outputNeurons; // No of neurons in output layer private final int inputNeurons; // No of neurons in input layer private int largestLayerNeurons; // No of neurons in largest layer private final int numberLayers; // No of layers private final int[] neuronCounts; // Neuron count in each layer, 0 is input // layer. private final float[][][] fWeights; // Weights between neurons. // fWeight[fromLayer][fromNeuron][toNeuron] // is the weight from fromNeuron in // fromLayer to toNeuron in layer // fromLayer+1. private float[][] neuronOutput; // Temporary storage of output from previous layer public float[] compute(float[] input) { // Copy input values to input layer output for (int i = 0; i < inputNeurons; i++) { neuronOutput[0][i] = input[i]; } // Loop through layers for (int layer = 1; layer < numberLayers; layer++) { // Loop over neurons in the layer and determine weighted input sum for (int neuron = 0; neuron < neuronCounts[layer]; neuron++) { // Bias neuron is the last neuron in the previous layer int biasNeuron = neuronCounts[layer - 1]; // Get weighted input from bias neuron - output is always 1.0 float activation = 1.0F * fWeights[layer - 1][biasNeuron][neuron]; // Get weighted inputs from rest of neurons in previous layer for (int inputNeuron = 0; inputNeuron < biasNeuron; inputNeuron++) { activation += neuronOutput[layer-1][inputNeuron] * fWeights[layer - 1][inputNeuron][neuron]; } // Store neuron output for next round of computation neuronOutput[layer][neuron] = sigmoid(activation); } } // Return output from network = output from last layer float[] result = new float[outputNeurons]; for (int i = 0; i < outputNeurons; i++) result[i] = neuronOutput[numberLayers - 1][i]; return result; } private final static float sigmoid(final float input) { return (float) (1.0F / (1.0F + Math.exp(-1.0F * input))); } } I am running the JVM with the -server option, and as of now my code is between 25% and 50% slower than similar C code. What can I do to improve this situation? Thank you, Martin Wiboe
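    One micro-optimization worth trying (a sketch only, not a measured fix) is to hoist the repeated multi-dimensional array lookups out of the hot path so the innermost loop only indexes one-dimensional locals. The hypothetical helper below is meant to slot into the FeedForwardNetwork class above and mirrors the middle of compute() with unchanged semantics:

        // Per-layer arrays are passed in once so the innermost loop sees plain
        // one-dimensional array accesses instead of repeated [layer - 1] lookups.
        private static void computeLayer(float[][] weights,   // fWeights[layer - 1]
                                         float[] prevOutput,  // neuronOutput[layer - 1]
                                         float[] currOutput,  // neuronOutput[layer]
                                         int neuronsInLayer,  // neuronCounts[layer]
                                         int biasNeuron) {    // neuronCounts[layer - 1]
            for (int neuron = 0; neuron < neuronsInLayer; neuron++) {
                // The bias neuron always outputs 1.0F, so its contribution is just the weight.
                float activation = weights[biasNeuron][neuron];
                for (int inputNeuron = 0; inputNeuron < biasNeuron; inputNeuron++) {
                    activation += prevOutput[inputNeuron] * weights[inputNeuron][neuron];
                }
                currOutput[neuron] = (float) (1.0 / (1.0 + Math.exp(-activation)));
            }
        }

    Flattening each layer's weights into a single float[] (indexed as inputNeuron * toCount + neuron, or transposed so the inner loop walks one contiguous row) is another common step, but whether either change actually helps depends on the layer sizes and should be measured rather than assumed.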

    Read the article

  • How can I center XHTML content with CSS?

    - by drea
    so I recently converted a website of mine from a table content format to a div content format. Table format Version: Table version of the website: here. Table version style CSS: body { width: 1020px; margin: 0 auto; background-image: url(images/bg.png); } .logo{ width:301px; height:151px; background:url(images/logo.png); text-indent:-9999px; border:none; cursor:pointer; } .logo:hover { opacity:0.9; } .signin{ width:69px; height:30px; background:url(images/signin.png); text-indent:-9999px; border:none; cursor:pointer; } .signin:hover { opacity:0.9; } .register{ width:79px; height:30px; background:url(images/register.png); text-indent:-9999px; border:none; cursor:pointer; } .register:hover { opacity:0.9; } .Contact_Us{ width:53px; height:9px; background:url(images/Contact_Us.png); text-indent:-9999px; border:none; cursor:pointer; } .Contact_Us:hover { opacity:0.9; } .Code_of_Conduct{ width:84px; height:9px; background:url(images/Code_of_Conduct.png); text-indent:-9999px; border:none; cursor:pointer; } .Code_of_Conduct:hover { opacity:0.9; } .Privacy_Policy{ width:65px; height:12px; background:url(images/Privacy_Policy.png); text-indent:-9999px; border:none; cursor:pointer; } .Privacy_Policy:hover { opacity:0.9; } .Copyright{ width:149px; height:9px; background:url(images/Copyright.png); text-indent:-9999px; border:none; cursor:pointer; } .Copyright:hover { opacity:0.9; } .slideshow{ width:301px; height:151px; background: url(slideshow.png), url(minecraft.png), url(tf2.png), url(CSS.png), url(GM.png), url(aos.png), url(CSGO.png), url(voip.png), text-indent:-9999px; border:none; cursor:pointer; } .slideshow:hover { opacity:0.9; } Table version source: <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd"> <html> <head profile="http://www.w3.org/2005/10/profile"> <link rel="icon" type="image/png" href="http://www.xodusen.com/resources/images/favicon.png"> <title>Welcome to XodusEN</title> <script type="text/javascript" src="https://ajax.googleapis.com/ajax/libs/jquery/1.8.0/jquery.min.js"></script> <script type="text/javascript" src="http://cloud.github.com/downloads/malsup/cycle/jquery.cycle.all.latest.js"></script> <script type="text/javascript"> $(document).ready(function() { $('.slideshow').cycle({ fx: 'fade' // choose your transition type, ex: fade, scrollUp, shuffle, etc... }); }); </script> <meta name="description" content="This is the homepage of XodusEN. Xodus Entertainment Network is a unique & friendly Gaming Community that welcomes & realises the potential, and value within any user regardless of their origin. 
" > <meta name="keywords" content="XeN, Xodus, XEN, xen, Xodus Entertainment Network, gaming, community, PC, Steam, XBL, Xbox 360, PSN, Playstation, games, Gaming, Community, XodusEN, Gaming Network, Network, TF2, Server, CS:S, Minecraft, premium, servers, Counter-Strike: Source, Website, Homepage, Minecraftia" > <meta http-equiv="Content-Type" content="text/html; charset=utf-8"> <link href="style.css" rel="stylesheet" type="text/css"> <!--[if IE]> <script type="text/javascript"> window.location = "http://www.xodusen.com/ie/"; </script> <![endif]--> </head> <body bgcolor="#d7d7d7"> <table id="Table_01" border="0" cellpadding="0" cellspacing="0"> <tr> <td colspan="18"> <img src="images/index_01.png" width="1020" height="9" alt=""></td> <td> <img src="images/spacer.gif" width="1" height="9" alt=""></td> </tr> <tr> <td colspan="11" rowspan="2"> <img src="images/index_02.png" width="826" height="252" alt=""></td> <td> <a id="signin" class="signin" href="http://s.xodusen.com/VrtqYm"> <img src="images/signin.png" width="69" height="30" border="0" alt=""></a> <td rowspan="6"> <img src="images/index_04.png" width="3" height="643" alt=""></td> <td colspan="3"> <a id="register" class="register" href="http://s.xodusen.com/WW3rpZ"> <img src="images/Register.png" width="79" height="30" border="0" alt=""></a> <td colspan="2" rowspan="6"> <img src="images/index_06.png" width="43" height="643" alt=""></td> <td> <img src="images/spacer.gif" width="1" height="30" alt=""></td> </tr> <tr> <td rowspan="5"> <img src="images/index_07.png" width="69" height="613" alt=""></td> <td colspan="3" rowspan="5"> <img src="images/index_08.png" width="79" height="613" alt=""></td> <td> <img src="images/spacer.gif" width="1" height="222" alt=""></td> </tr> <tr> <td colspan="5"> <img src="images/index_09.png" width="385" height="53" alt=""></td> <td> <img src="images/index_10.png" width="250" height="53" alt=""></td> <td colspan="5"> <img src="images/index_11.png" width="191" height="53" alt=""></td> <td> <img src="images/spacer.gif" width="1" height="53" alt=""></td> </tr> <tr> <td colspan="4" rowspan="3"> <img src="images/index_09-13.png" width="360" height="338" alt=""></td> <td colspan="3"> <a id="logo" class="logo" href="http://www.xodusen.com/community"> <img src="images/logo.png" alt=""></a> </td> <td colspan="4" rowspan="3"> <img src="images/index_11-15.png" width="165" height="338" alt=""></td> <td> <img src="images/spacer.gif" width="1" height="151" alt=""></td> </tr> <tr> <td rowspan="2"> <img src="images/index_09-16.png" width="25" height="187" alt=""></td> <td> <img src="images/index_16.png" width="250" height="46" alt=""></td> <td rowspan="2"> <img src="images/index_11-18.png" width="26" height="187" alt=""></td> <td> <img src="images/spacer.gif" width="1" height="46" alt=""></td> </tr> <tr> <td> <img src="images/index_12.png" width="250" height="141" alt=""></td> <td> <img src="images/spacer.gif" width="1" height="141" alt=""></td> </tr> <tr> <td rowspan="7"> <img src="images/index_13.png" width="27" height="548" alt=""></td> <td colspan="16" id="slideshow" class="slideshow"> <a href="http://www.xodusen.com/community"><img src="images/slideshow.png" width="960" height="305" alt=""></a> <a href="http://www.xodusen.com/mcurl"><img src="images/minecraft.png" width="960" height="305" alt=""></a> <a href="steam://connect/74.121.188.194:27015"><img src="images/tf2.png" width="960" height="305" alt=""></a> <a href="steam://connect/74.121.188.195:27015"><img src="images/CSS.png" width="960" height="305" alt=""></a> <a 
href="steam://connect/74.121.188.197:27015"><img src="images/GM.png" width="960" height="305" alt=""></a> <a href="aos://3267131722:32887"><img src="images/aos.png" width="960" height="305" alt=""></a> <a href="steam://connect/74.121.188.196:27015"><img src="images/CSGO.png" width="960" height="305" alt=""></a></td> <td rowspan="7"> <img src="images/index_15.png" width="33" height="548" alt=""></td> <td> <img src="images/spacer.gif" width="1" height="305" alt=""></td> </tr> <tr> <td colspan="16"> <img src="images/index_16-23.png" width="960" height="155" alt=""></td> <td> <img src="images/spacer.gif" width="1" height="155" alt=""></td> </tr> <tr> <td rowspan="5"> <img src="images/index_17.png" width="38" height="88" alt=""></td> <td rowspan="2"> <a id="Copyright" class="Copyright" href="http://www.xodusen.com/community"> <img src="images/Copyright.png" width="149" height="9" border="0" alt=""></a></td> <td colspan="14"> <img src="images/index_25.png" width="773" height="5" alt=""></td> <td> <img src="images/spacer.gif" width="1" height="5" alt=""></td> </tr> <tr> <td colspan="5" rowspan="4"> <img src="images/index_20.png" width="527" height="83" alt=""></td> <td rowspan="3"> <a id="Privacy_Policy" class="Privacy_Policy" href="http://s.xodusen.com/VhGEkH"> <img src="images/Privacy_Policy.png" width="65" height="12" border="0" alt=""></a></td> <td rowspan="4"> <img src="images/index_28.png" width="8" height="83" alt=""></td> <td colspan="3" rowspan="2"> <a id="Code_of_Conduct" class="Code_of_Conduct" href="http://s.xodusen.com/Tf5Gz7"> <img src="images/Code_of_Conduct.png" width="84" height="9" border="0" alt=""></a></td> <td rowspan="4"> <img src="images/index_30.png" width="6" height="83" alt=""></td> <td rowspan="2"> <a id="Contact_Us" class="Contact_Us" href="http://s.xodusen.com/T5EYsG"> <img src="images/Contact_Us.png" width="53" height="9" border="0" alt=""></a></td> <td colspan="2" rowspan="4"> <img src="images/index_26.png" width="30" height="83" alt=""></td> <td> <img src="images/spacer.gif" width="1" height="4" alt=""></td> </tr> <tr> <td rowspan="3"> <img src="images/index_27.png" width="149" height="79" alt=""></td> <td> <img src="images/spacer.gif" width="1" height="5" alt=""></td> </tr> <tr> <td colspan="3" rowspan="2"> <img src="images/index_28-35.png" width="84" height="74" alt=""></td> <td rowspan="2"> <img src="images/index_29.png" width="53" height="74" alt=""></td> <td> <img src="images/spacer.gif" width="1" height="3" alt=""></td> </tr> <tr> <td> <img src="images/index_30-37.png" width="65" height="71" alt=""></td> <td> <img src="images/spacer.gif" width="1" height="71" alt=""></td> </tr> <tr> <td> <img src="images/spacer.gif" width="27" height="1" alt=""></td> <td> <img src="images/spacer.gif" width="38" height="1" alt=""></td> <td> <img src="images/spacer.gif" width="149" height="1" alt=""></td> <td> <img src="images/spacer.gif" width="146" height="1" alt=""></td> <td> <img src="images/spacer.gif" width="25" height="1" alt=""></td> <td> <img src="images/spacer.gif" width="250" height="1" alt=""></td> <td> <img src="images/spacer.gif" width="26" height="1" alt=""></td> <td> <img src="images/spacer.gif" width="80" height="1" alt=""></td> <td> <img src="images/spacer.gif" width="65" height="1" alt=""></td> <td> <img src="images/spacer.gif" width="8" height="1" alt=""></td> <td> <img src="images/spacer.gif" width="12" height="1" alt=""></td> <td> <img src="images/spacer.gif" width="69" height="1" alt=""></td> <td> <img src="images/spacer.gif" width="3" height="1" 
alt=""></td> <td> <img src="images/spacer.gif" width="6" height="1" alt=""></td> <td> <img src="images/spacer.gif" width="53" height="1" alt=""></td> <td> <img src="images/spacer.gif" width="20" height="1" alt=""></td> <td> <img src="images/spacer.gif" width="10" height="1" alt=""></td> <td> <img src="images/spacer.gif" width="33" height="1" alt=""></td> <td></td> </tr> </table> </body> </html> Div format Version: Div version of the website: here. Div version style CSS: body { width: 1020px; margin: 0 auto; background-image: url(images/bg.png); } #Table_01 { position:absolute; left:0px; top:0px; width:1020px; height:1200px; } #index-01_ { position:absolute; left:0px; top:0px; width:1020px; height:9px; } #index-02_ { position:absolute; left:0px; top:9px; width:826px; height:305px; } #Signin_ { position:absolute; left:826px; top:9px; width:69px; height:30px; } #index-04_ { position:absolute; left:895px; top:9px; width:3px; height:643px; } #Register_ { position:absolute; left:898px; top:9px; width:79px; height:30px; } #index-06_ { position:absolute; left:977px; top:9px; width:43px; height:643px; } #index-07_ { position:absolute; left:826px; top:39px; width:69px; height:613px; } #index-08_ { position:absolute; left:898px; top:39px; width:79px; height:613px; } #index-09_ { position:absolute; left:0px; top:314px; width:360px; height:338px; } #Logo_ { position:absolute; left:360px; top:314px; width:301px; height:151px; } #index-11_ { position:absolute; left:661px; top:314px; width:165px; height:338px; } #index-12_ { position:absolute; left:360px; top:465px; width:301px; height:187px; } #index-13_ { position:absolute; left:0px; top:652px; width:27px; height:548px; } #Slideshow_ { position:absolute; left:27px; top:652px; width:960px; height:305px; } #index-15_ { position:absolute; left:987px; top:652px; width:33px; height:548px; } #index-16_ { position:absolute; left:27px; top:957px; width:960px; height:155px; } #index-17_ { position:absolute; left:27px; top:1112px; width:39px; height:88px; } #Copyright_ { position:absolute; left:66px; top:1112px; width:148px; height:13px; } #index-19_ { position:absolute; left:214px; top:1112px; width:773px; height:5px; } #index-20_ { position:absolute; left:214px; top:1117px; width:526px; height:83px; } #Privacy-Policy_ { position:absolute; left:740px; top:1117px; width:68px; height:23px; } #index-22_ { position:absolute; left:808px; top:1117px; width:6px; height:83px; } #Code-of-Conduct_ { position:absolute; left:814px; top:1117px; width:84px; height:23px; } #index-24_ { position:absolute; left:898px; top:1117px; width:2px; height:83px; } #Contact-Us_ { position:absolute; left:900px; top:1117px; width:57px; height:23px; } #index-26_ { position:absolute; left:957px; top:1117px; width:30px; height:83px; } #index-27_ { position:absolute; left:66px; top:1125px; width:148px; height:75px; } #index-28_ { position:absolute; left:740px; top:1140px; width:68px; height:60px; } #index-29_ { position:absolute; left:814px; top:1140px; width:84px; height:60px; } #index-30_ { position:absolute; left:900px; top:1140px; width:57px; height:60px; } .logo{ width:301px; height:151px; background:url(images/logo.png); text-indent:-9999px; border:none; cursor:pointer; } .logo:hover { opacity:0.9; } .signin{ width:69px; height:30px; background:url(images/signin.png); text-indent:-9999px; border:none; cursor:pointer; } .signin:hover { opacity:0.9; } .register{ width:79px; height:30px; background:url(images/register.png); text-indent:-9999px; border:none; cursor:pointer; } .register:hover 
{ opacity:0.9; } .contact_Us{ width:53px; height:9px; background:url(images/Contact_Us.png); text-indent:-9999px; border:none; cursor:pointer; } .contact_Us:hover { opacity:0.9; } .code_of_Conduct{ width:84px; height:9px; background:url(images/Code_of_Conduct.png); text-indent:-9999px; border:none; cursor:pointer; } .code_of_Conduct:hover { opacity:0.9; } .privacy_policy{ width:65px; height:12px; background:url(images/Privacy_Policy.png); text-indent:-9999px; border:none; cursor:pointer; } .privacy_policy:hover { opacity:0.9; } .copyright{ width:148px; height:13px; background:url(images/Copyright.png); text-indent:-9999px; border:none; cursor:pointer; } .copyright:hover { opacity:0.9; } .slideshow{ width:301px; height:151px; background: url(slideshow.png), url(minecraft.png), url(tf2.png), url(CSS.png), url(GM.png), url(aos.png), url(CSGO.png), url(voip.png), text-indent:-9999px; border:none; cursor:pointer; } .slideshow:hover { opacity:0.9; } Div version source: <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> <html xmlns="http://www.w3.org/1999/xhtml"> <head> <script type="text/javascript"> //<![CDATA[ window.__CF=window.__CF||{};window.__CF.AJS={"vig_key":{"sid":"c6d1454039dd49b1c8400bbfdf74df7a"},"trumpet":{"message":"XodusEN is undergoing background maintenance, that will provide performance & graphical improvements to our system, but will not hinder your experience across our services."},"ga_key":{"ua":"UA-35779435-1","ga_bs":"2"},"exprmntly":{"service_id":"7967"},"cdnjs":{"__h":"1","cdnjs":"MO,GF,FX,CS,JS"},"abetterbrowser":{"ie":"10"}}; //]]> </script> <script type="text/javascript"> //<![CDATA[ try{if (!window.CloudFlare) { var CloudFlare=[{verbose:0,p:0,byc:0,owlid:"cf",mirage:{responsive:0,lazy:0},oracle:0,paths:{cloudflare:"/cdn-cgi/nexp/aav=1870252173/"},atok:"d6e39f49946fcb6d690f0d10d5a963f3",zone:"xodusen.com",rocket:"a",apps:{"vig_key":{"sid":"c6d1454039dd49b1c8400bbfdf74df7a"},"trumpet":{"message":"XodusEN is undergoing background maintenance, that will provide performance & graphical improvements to our system, but will not hinder your experience across our services."},"ga_key":{"ua":"UA-35779435-1","ga_bs":"2"},"exprmntly":{"service_id":"7967"},"cdnjs":{"__h":"1","cdnjs":"MO,GF,FX,CS,JS"},"abetterbrowser":{"ie":"10"}}}];document.write('<script type="text/javascript" src="//ajax.cloudflare.com/cdn-cgi/nexp/aav=4114775854/cloudflare.min.js"><'+'\/script>')}}catch(e){}; //]]> </script> <script type="text/javascript" src="//ajax.cloudflare.com/cdn-cgi/nexp/aav=1566821048/appsh.min.js"></script><script type="text/javascript">__CF.AJS.inith();</script><link rel="icon" type="image/png" href="http://www.xodusen.com/resources/images/favicon.png"> <title>Welcome to XodusEN</title> <meta name="description" content="This is the homepage of XodusEN. Xodus Entertainment Network is a unique & friendly Gaming Community that welcomes & realises the potential, and value within any user regardless of their origin. 
"> <meta name="keywords" content="XeN, Xodus, XEN, xen, Xodus Entertainment Network, gaming, community, PC, Steam, XBL, Xbox 360, PSN, Playstation, games, Gaming, Community, XodusEN, Gaming Network, Network, TF2, Server, CS:S, Minecraft, premium, servers, Counter-Strike: Source, Website, Homepage, Minecraftia"> <meta http-equiv="Content-Type" content="text/html; charset=utf-8"/> <link href="style.css" rel="stylesheet" type="text/css"> <script type="text/rocketscript" data-rocketsrc="https://ajax.googleapis.com/ajax/libs/jquery/1.8.0/jquery.min.js"></script> <script type="text/rocketscript" data-rocketsrc="http://cloud.github.com/downloads/malsup/cycle/jquery.cycle.all.latest.js"></script> <script type="text/rocketscript"> $(document).ready(function() { $('.slideshow').cycle({ fx: 'fade' // choose your transition type, ex: fade, scrollUp, shuffle, etc... }); }); </script> <!--[if IE]> <script type="text/javascript"> window.location = "http://www.xodusen.com/ie/"; </script> <![endif]--> <script type="text/javascript"> /* <![CDATA[ */ var _gaq = _gaq || []; _gaq.push(['_setAccount', 'UA-35779435-1']); _gaq.push(['_trackPageview']); (function() { var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true; ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js'; var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s); })(); (function(b){(function(a){"__CF"in b&&"DJS"in b.__CF?b.__CF.DJS.push(a):"addEventListener"in b?b.addEventListener("load",a,!1):b.attachEvent("onload",a)})(function(){"FB"in b&&"Event"in FB&&"subscribe"in FB.Event&&(FB.Event.subscribe("edge.create",function(a){_gaq.push(["_trackSocial","facebook","like",a])}),FB.Event.subscribe("edge.remove",function(a){_gaq.push(["_trackSocial","facebook","unlike",a])}),FB.Event.subscribe("message.send",function(a){_gaq.push(["_trackSocial","facebook","send",a])}));"twttr"in b&&"events"in twttr&&"bind"in twttr.events&&twttr.events.bind("tweet",function(a){if(a){var b;if(a.target&&a.target.nodeName=="IFRAME")a:{if(a=a.target.src){a=a.split("#")[0].match(/[^?=&]+=([^&]*)?/g);b=0;for(var c;c=a[b];++b)if(c.indexOf("url")===0){b=unescape(c.split("=")[1]);break a}}b=void 0}_gaq.push(["_trackSocial","twitter","tweet",b])}})})})(window); /* ]]> */ </script> <meta name="pinterest" content="nopin"/></head> <body style="background-color:#d7d7d7;"><script type="text/javascript"> //<![CDATA[ try{(function(a){var b="http://",c="www.xodusen.com",d="/cdn-cgi/cl/",e="618e40fe1e01787d9cb9aa2f8abc52caf8a32796.gif",f=new a;f.src=[b,c,d,e].join("")})(Image)}catch(e){} //]]> </script> <div id="Table_01"> <div id="index-01_"> <img id="index_01" src="images/index_01.png" width="1020" height="9" alt=""/> </div> <div id="index-02_"> <img id="index_02" src="images/index_02.png" width="826" height="305" alt=""/> </div> <div id="Signin_"> <a href="http://s.xodusen.com/VrtqYm"> <img id="Signin" class="signin" src="images/Signin.png" width="69" height="30" border="0" alt=""/></a> </div> <div id="index-04_"> <img id="index_04" src="images/index_04.png" width="3" height="643" alt=""/> </div> <div id="Register_"> <a href="http://s.xodusen.com/WW3rpZ"> <img id="Register" class="register" src="images/Register.png" width="79" height="30" alt=""/></a> </div> <div id="index-06_"> <img id="index_06" src="images/index_06.png" width="43" height="643" alt=""/> </div> <div id="index-07_"> <img id="index_07" src="images/index_07.png" width="69" height="613" alt=""/> </div> 
<div id="index-08_"> <img id="index_08" src="images/index_08.png" width="79" height="613" alt=""/> </div> <div id="index-09_"> <img id="index_09" src="images/index_09.png" width="360" height="338" alt=""/> </div> <div id="Logo_"> <a href="http://s.xodusen.com/WW3rpZ"> <img class="logo" src="images/Logo.png" width="301" height="151" alt=""></a> </div> <div id="index-11_"> <img id="index_11" src="images/index_11.png" width="165" height="338" alt=""/> </div> <div id="index-12_"> <img id="index_12" src="images/index_12.png" width="301" height="187" alt=""/> </div> <div id="index-13_"> <img id="index_13" src="images/index_13.png" width="27" height="548" alt=""/> </div> <div id="Slideshow_" class="slideshow"> <a href="http://www.xodusen.com/community"> <img src="images/slideshow.png" width="960" height="305" alt=""></a> <a href="http://www.xodusen.com/mcurl"> <img src="images/minecraft.png" width="960" height="305" alt=""></a> <a href="steam://connect/74.121.188.194:27015"> <img src="images/tf2.png" width="960" height="305" alt=""></a> <a href="steam://connect/74.121.188.195:27015"> <img src="images/CSS.png" width="960" height="305" alt=""></a> <a href="steam://connect/74.121.188.197:27015"> <img src="images/GM.png" width="960" height="305" alt=""></a> <a href="aos://3267131722:32887"> <img src="images/aos.png" width="960" height="305" alt=""></a> <a href="steam://connect/74.121.188.196:27015"> <img src="images/CSGO.png" width="960" height="305" alt=""></a> </div> <div id="index-15_"> <img id="index_15" src="images/index_15.png" width="33" height="548" alt=""/> </div> <div id="index-16_"> <img id="index_16" src="images/index_16.png" width="960" height="155" alt=""/> </div> <div id="index-17_"> <img id="index_17" src="images/index_17.png" width="39" height="88" alt=""/> </div> <div id="Copyright_"> <a href="http://www.xodusen.com/community"> <img id="Copyright" src="images/Copyright.png" width="148" height="13" alt=""></a> </div> <div id="index-19_"> <img id="index_19" src="images/index_19.png" width="773" height="5" alt=""/> </div> <div id="index-20_"> <img id="index_20" src="images/index_20.png" width="526" height="83" alt=""/> </div> <div id="Privacy-Policy_"> <a href="http://s.xodusen.com/VhGEkH"> <img id="Privacy_Policy" src="images/Privacy_Policy.png" width="68" height="23" alt=""></a> </div> <div id="index-22_"> <img id="index_22" src="images/index_22.png" width="6" height="83" alt=""/> </div> <div id="Code-of-Conduct_"> <a href="http://s.xodusen.com/Tf5Gz7"> <img id="Code_of_Conduct" src="images/Code_of_Conduct.png" width="84" height="23" alt=""></a> </div> <div id="index-24_"> <img id="index_24" src="images/index_24.png" width="2" height="83" alt=""/> </div> <div id="Contact-Us_"> <a href="http://s.xodusen.com/T5EYsG"> <img id="Contact_Us" src="images/Contact_Us.png" width="57" height="23" alt=""></a> </div> <div id="index-26_"> <img id="index_26" src="images/index_26.png" width="30" height="83" alt=""/> </div> <div id="index-27_"> <img id="index_27" src="images/index_27.png" width="148" height="75" alt=""/> </div> <div id="index-28_"> <img id="index_28" src="images/index_28.png" width="68" height="60" alt=""/> </div> <div id="index-29_"> <img id="index_29" src="images/index_29.png" width="84" height="60" alt=""/> </div> <div id="index-30_"> <img id="index_30" src="images/index_30.png" width="57" height="60" alt=""/> </div> </div> <script type="text/javascript" src="//ajax.cloudflare.com/cdn-cgi/nexp/aav=4188748942/apps1.min.js"></script><script 
type="text/javascript">__CF.AJS.init1();</script></body> </html> My issue is, how can I achieve the same 'centered' results in the div format of the website, as the table format of the website? I have done some research to no avail, so I'd thought given the reputation of this site, that i'd post my issue here. Thank you in advance, ~ drea.

    Read the article

  • postfix relaying all mail through office365 problems

    - by amrith
    This is a rather long question with a long list of things tried and travails so please bear with me. The summary is this. I am able to relay email from ubuntu through office365 using postfix; the configuration works. It only works as one of the users; more specifically the user who authenticates against office365 is the only valid "from" More details follow. I have a machine in Amazon's cloud on which I run a bunch of jobs and would like to have statuses mailed over to me. I use office365 at work so I want to relay mail through office365. I'm most familiar with postfix so I used that as the MTA. Configuration is ubuntu 12.04LTS; I've installed postfix and mail-utils. For this example, let me say my company is "company.com" and the machine in question (through an elastic IP and a DNS entry) is called "plaything.company.com". hostname is set to "plaything.company.com", so is /etc/mailname On plaything, I have the following users registered alpha, bravo, and charlie. I have the following configuration files. alias_database = hash:/etc/aliases alias_maps = hash:/etc/aliases append_dot_mydomain = no biff = no config_directory = /etc/postfix inet_interfaces = all inet_protocols = ipv4 mailbox_size_limit = 0 mydestination = plaything.company.com, localhost.company.com, , localhost myhostname = plaything.company.com mynetworks = 127.0.0.0/8 [::ffff:127.0.0.0]/104 [::1]/128 myorigin = /etc/mailname readme_directory = no recipient_delimiter = + relayhost = [smtp.office365.com]:587 sender_canonical_maps = hash:/etc/postfix/sender_canonical smtp_sasl_auth_enable = yes smtp_sasl_password_maps = hash:/etc/postfix/sasl_passwd smtp_sasl_security_options = noanonymous smtp_sasl_tls_security_options = noanonymous smtp_tls_CAfile = /etc/ssl/certs/ca-certificates.crt smtp_tls_session_cache_database = btree:${data_directory}/smtp_scache smtp_use_tls = yes smtpd_banner = $myhostname ESMTP $mail_name (Ubuntu) smtpd_tls_cert_file = /etc/ssl/certs/ssl-cert-snakeoil.pem smtpd_tls_key_file = /etc/ssl/private/ssl-cert-snakeoil.key smtpd_tls_session_cache_database = btree:${data_directory}/smtpd_scache smtpd_use_tls = yes As the machine is called plaything.company.com I went through the exercise of registering all the appropriate DNS entries to make office365 recognize that I owned plaything.company.com and allowed me to create a user called [email protected] in office365. In office365, I setup [email protected] as having another email address of [email protected]. Then, I made the following sender_canonical [email protected] [email protected] I created a sasl_passwd file that reads: smtp.office365.com [email protected]:123456password123456 let's just say that the password for [email protected] is 1234...456 With all this setup, login as alpha and mail [email protected] Cc: Subject: test test and the whole thing works wonderfully. email gets sent off by postfix, TLS works like a champ, authenticates as daemon@... and [email protected] in Office365 gets an email message. The issue comes up when logged in as bravo to the machine. sender is [email protected] and office365 says: status=bounced (host smtp.office365.com[132.245.12.25] said: 550 5.7.1 Client does not have permissions to send as this sender (in reply to end of DATA command)) this is because I'm trying to send mail as bravo@... and authenticating with office365 as daemon@.... The reason it works with alpha@... is because in office365, I setup [email protected] as having another email address of [email protected]. 
In Postfix Relay to Office365, Miles Erickson answers the question thusly: Don't send mail to Office365 as a user from your Office365-hosted e-mail domain. Use a subdomain instead, e.g. [email protected] instead of [email protected]. It wouldn't hurt to set up an SPF record for services.mydomain.com or whatever you decide to use. Don't authenticate against mail.messaging.microsoft.com as an Office365 user. Just connect on port 25 and deliver the mail to your domain as any foreign SMTP agent would do. OK, I've done #1, I have those records on DNS but for the most part they are not relevant once Office365 recognizes that I own the domain. Here are those records: CNAME records: - msoid.plaything.company.com - autodiscover.plaything.company.com MX record: - plaything.company.com (plaything-company-com.mail.protection.outlook.com) TXT record: - plaything.company.com (v=spf1 include:spf.protection.outlook.com -all) I've tried #2 but no matter what I do, office365 just blows away the connection with "not authenticated". I can try even a simple telnet to port 25 and attempt to send and it doesn't work. 250 BY2PR01CA007.outlook.office365.com Hello [54.221.245.236] 530 5.7.1 Client was not authenticated Connection closed by foreign host. Is there someone out there who has this kind of a configuration working where multiple users on a linux machine are able to relay mail using postfix through office365? There has to be someone out there doing this who can tell me what is wrong with my setup ...
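    A sketch of one possible workaround (not from the original thread; the address below is only a placeholder for whichever mailbox is actually listed in sasl_passwd): since Office 365 rejects any envelope sender other than the authenticating account, postfix can rewrite every local sender to that one account before relaying, so mail from bravo and charlie goes out "as" the daemon mailbox just like alpha's does. Replacing the hash: sender_canonical table with a regexp: table does this in two lines; reload postfix afterwards.

      # /etc/postfix/main.cf : only this line changes, everything else stays as posted
      sender_canonical_maps = regexp:/etc/postfix/sender_canonical

      # /etc/postfix/sender_canonical (regexp table, so no postmap step is needed)
      # rewrite every sender to the single account that authenticates against Office 365
      /.+/    daemon@plaything.company.com

    The trade-off is that all outgoing mail then appears to come from that one address; the alternative is to give the authenticating account the right, on the Office 365 side, to send as each additional address, which matches what the question describes as the reason the alpha@ case already works.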

    Read the article

  • installed openstack using devstack install shell script but getting 500 error when i try opening dashboard

    - by Arvind
    I followed the instructions at http://devstack.org/guides/single-machine.html to install OpenStack. I first installed Ubuntu on my Windows 7 PC using the officially supported Windows installer for Ubuntu 12.04 LTS. And after that I followed the instructions at above page to install OpenStack. As per instructions, I should be able to access the dashboard aka Horizon, at http://192.168.1.4/ (thats the IP of the PC on which I installed Ubuntu-OpenStack). However I am getting a 500 error web page when I open that. How do I resolve this error? I want to set up a dev environment for OpenStack. For your ref, the whole error message is given now-- FilterError at / /usr/bin/env: node: No such file or directory Request Method: GET Request URL: http://192.168.1.4/ Django Version: 1.4.2 Exception Type: FilterError Exception Value: /usr/bin/env: node: No such file or directory Exception Location: /usr/local/lib/python2.7/dist-packages/compressor/filters/base.py in input, line 133 Python Executable: /usr/bin/python Python Version: 2.7.3 Python Path: ['/opt/stack/horizon/openstack_dashboard/wsgi/../..', '/opt/stack/python-keystoneclient', '/opt/stack/python-novaclient', '/opt/stack/python-openstackclient', '/opt/stack/keystone', '/opt/stack/glance', '/opt/stack/python-glanceclient/setuptools_git-0.4.2-py2.7.egg', '/opt/stack/python-glanceclient', '/opt/stack/nova', '/opt/stack/horizon', '/opt/stack/cinder', '/opt/stack/python-cinderclient', '/usr/local/lib/python2.7/dist-packages', '/usr/lib/python2.7', '/usr/lib/python2.7/plat-linux2', '/usr/lib/python2.7/lib-tk', '/usr/lib/python2.7/lib-old', '/usr/lib/python2.7/lib-dynload', '/usr/lib/python2.7/dist-packages', '/usr/lib/python2.7/dist-packages/PIL', '/usr/lib/python2.7/dist-packages/gst-0.10', '/usr/lib/python2.7/dist-packages/gtk-2.0', '/usr/lib/pymodules/python2.7', '/usr/lib/python2.7/dist-packages/ubuntu-sso-client', '/usr/lib/python2.7/dist-packages/ubuntuone-client', '/usr/lib/python2.7/dist-packages/ubuntuone-control-panel', '/usr/lib/python2.7/dist-packages/ubuntuone-couch', '/usr/lib/python2.7/dist-packages/ubuntuone-storage-protocol', '/opt/stack/horizon/openstack_dashboard'] Server time: Sat, 27 Oct 2012 08:43:29 +0000 Error during template rendering In template /opt/stack/horizon/openstack_dashboard/templates/_stylesheets.html, error at line 3 /usr/bin/env: node: No such file or directory 1 {% load compress %} 2 3 {% compress css %} 4 <link href='{{ STATIC_URL }}dashboard/less/horizon.less' type='text/less' media='screen' rel='stylesheet' /> 5 {% endcompress %} 6 7 <link rel="shortcut icon" href="{{ STATIC_URL }}dashboard/img/favicon.ico"/> 8 Also, the traceback is now given below-- Environment: Request Method: GET Request URL: http://192.168.1.4/ Django Version: 1.4.2 Python Version: 2.7.3 Installed Applications: ('openstack_dashboard', 'django.contrib.contenttypes', 'django.contrib.auth', 'django.contrib.sessions', 'django.contrib.messages', 'django.contrib.staticfiles', 'django.contrib.humanize', 'compressor', 'horizon', 'openstack_dashboard.dashboards.project', 'openstack_dashboard.dashboards.admin', 'openstack_dashboard.dashboards.settings', 'openstack_auth') Installed Middleware: ('django.middleware.common.CommonMiddleware', 'django.middleware.csrf.CsrfViewMiddleware', 'django.contrib.sessions.middleware.SessionMiddleware', 'django.contrib.auth.middleware.AuthenticationMiddleware', 'django.contrib.messages.middleware.MessageMiddleware', 'horizon.middleware.HorizonMiddleware', 'django.middleware.doc.XViewMiddleware', 
'django.middleware.locale.LocaleMiddleware') Template error: In template /opt/stack/horizon/openstack_dashboard/templates/_stylesheets.html, error at line 3 /usr/bin/env: node: No such file or directory 1 : {% load compress %} 2 : 3 : {% compress css %} 4 : <link href='{{ STATIC_URL }}dashboard/less/horizon.less' type='text/less' media='screen' rel='stylesheet' /> 5 : {% endcompress %} 6 : 7 : <link rel="shortcut icon" href="{{ STATIC_URL }}dashboard/img/favicon.ico"/> 8 : Traceback: File "/usr/local/lib/python2.7/dist-packages/django/core/handlers/base.py" in get_response 111. response = callback(request, *callback_args, **callback_kwargs) File "/usr/local/lib/python2.7/dist-packages/django/views/decorators/vary.py" in inner_func 36. response = func(*args, **kwargs) File "/opt/stack/horizon/openstack_dashboard/wsgi/../../openstack_dashboard/views.py" in splash 38. return shortcuts.render(request, 'splash.html', {'form': form}) File "/usr/local/lib/python2.7/dist-packages/django/shortcuts/__init__.py" in render 44. return HttpResponse(loader.render_to_string(*args, **kwargs), File "/usr/local/lib/python2.7/dist-packages/django/template/loader.py" in render_to_string 176. return t.render(context_instance) File "/usr/local/lib/python2.7/dist-packages/django/template/base.py" in render 140. return self._render(context) File "/usr/local/lib/python2.7/dist-packages/django/template/base.py" in _render 134. return self.nodelist.render(context) File "/usr/local/lib/python2.7/dist-packages/django/template/base.py" in render 823. bit = self.render_node(node, context) File "/usr/local/lib/python2.7/dist-packages/django/template/debug.py" in render_node 74. return node.render(context) File "/usr/local/lib/python2.7/dist-packages/django/template/loader_tags.py" in render 155. return self.render_template(self.template, context) File "/usr/local/lib/python2.7/dist-packages/django/template/loader_tags.py" in render_template 137. output = template.render(context) File "/usr/local/lib/python2.7/dist-packages/django/template/base.py" in render 140. return self._render(context) File "/usr/local/lib/python2.7/dist-packages/django/template/base.py" in _render 134. return self.nodelist.render(context) File "/usr/local/lib/python2.7/dist-packages/django/template/base.py" in render 823. bit = self.render_node(node, context) File "/usr/local/lib/python2.7/dist-packages/django/template/debug.py" in render_node 74. return node.render(context) File "/usr/local/lib/python2.7/dist-packages/compressor/templatetags/compress.py" in render 147. return self.render_compressed(context, self.kind, self.mode, forced=forced) File "/usr/local/lib/python2.7/dist-packages/compressor/templatetags/compress.py" in render_compressed 107. rendered_output = self.render_output(compressor, mode, forced=forced) File "/usr/local/lib/python2.7/dist-packages/compressor/templatetags/compress.py" in render_output 119. return compressor.output(mode, forced=forced) File "/usr/local/lib/python2.7/dist-packages/compressor/css.py" in output 51. ret.append(subnode.output(*args, **kwargs)) File "/usr/local/lib/python2.7/dist-packages/compressor/css.py" in output 53. return super(CssCompressor, self).output(*args, **kwargs) File "/usr/local/lib/python2.7/dist-packages/compressor/base.py" in output 230. content = self.filter_input(forced) File "/usr/local/lib/python2.7/dist-packages/compressor/base.py" in filter_input 192. for hunk in self.hunks(forced): File "/usr/local/lib/python2.7/dist-packages/compressor/base.py" in hunks 167. 
precompiled, value = self.precompile(value, **options) File "/usr/local/lib/python2.7/dist-packages/compressor/base.py" in precompile 210. command=command, filename=filename).input(**kwargs) File "/usr/local/lib/python2.7/dist-packages/compressor/filters/base.py" in input 133. raise FilterError(err) Exception Type: FilterError at / Exception Value: /usr/bin/env: node: No such file or directory
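    The failing step is django-compressor shelling out to the LESS compiler, which expects a binary called node on the PATH; "/usr/bin/env: node: No such file or directory" simply means that binary is missing. A minimal sketch of the usual fix on Ubuntu 12.04 (assuming the stock apt packages; package and path names may differ on other releases):

      # install Node.js
      sudo apt-get install -y nodejs

      # some packagings install the interpreter only as /usr/bin/nodejs;
      # if so, add a compatibility symlink so "node" resolves
      [ -x /usr/bin/node ] || sudo ln -s /usr/bin/nodejs /usr/bin/node

    After that, restart the Horizon/Apache service (or re-run stack.sh) and reload the dashboard.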

    Read the article

  • Apache2 Segfault - need help interpreting this coredump (suspect cause is memcache / php session related)

    - by WayneDV
    Three Apache2 web servers running a PHP 5.2.3 web site. We're using Memcache to cache rendered pages but also as the storage engine of the PHP Sessions. At peak traffic times we're getting Apache segmentation faults on all three web servers and all HTTPD child processes segfault. My gut tells me that the increased Memcache traffic is stopping PHP sessions from being created or cleaned up and thus the processes die. Is it possible for someone to confirm that from the following? : #0 _zend_mm_free_int (heap=0x7fb67a075820, p=0x7fb67a011538) at /usr/src/debug/php-5.3.3/Zend/zend_alloc.c:2018 #1 0x00007fb665d02e82 in mmc_buffer_free (request=0x7fb67a011548) at /usr/src/debug/php-pecl-memcache-3.0.4/memcache-3.0.4/memcache_pool.c:50 #2 mmc_request_free (request=0x7fb67a011548) at /usr/src/debug/php-pecl-memcache-3.0.4/memcache-3.0.4/memcache_pool.c:169 #3 0x00007fb665d031ea in mmc_pool_free (pool=0x7fb67a00e458) at /usr/src/debug/php-pecl-memcache-3.0.4/memcache-3.0.4/memcache_pool.c:917 #4 0x00007fb665d0a2f1 in ps_close_memcache (mod_data=0x7fb66d625440) at /usr/src/debug/php-pecl-memcache-3.0.4/memcache-3.0.4/memcache_session.c:185 #5 0x00007fb66d1b0935 in php_session_save_current_state () at /usr/src/debug/php-5.3.3/ext/session/session.c:625 #6 php_session_flush () at /usr/src/debug/php-5.3.3/ext/session/session.c:1517 #7 0x00007fb66d1b0c1b in zm_deactivate_session (type=<value optimized out>, module_number=<value optimized out>) at /usr/src/debug/php-5.3.3/ext/session/session.c:2171 #8 0x00007fb66d2a719c in module_registry_cleanup (module=<value optimized out>) at /usr/src/debug/php-5.3.3/Zend/zend_API.c:2150 #9 0x00007fb66d2b1994 in zend_hash_reverse_apply (ht=0x7fb66d629d60, apply_func=0x7fb66d2a7180 <module_registry_cleanup>) at /usr/src/debug/php-5.3.3/Zend/zend_hash.c:755 #10 0x00007fb66d2a5c0d in zend_deactivate_modules () at /usr/src/debug/php-5.3.3/Zend/zend.c:866 #11 0x00007fb66d2541b5 in php_request_shutdown (dummy=<value optimized out>) at /usr/src/debug/php-5.3.3/main/main.c:1607 #12 0x00007fb66d32e037 in php_apache_request_dtor (r=0x7fb67a229658) at /usr/src/debug/php-5.3.3/sapi/apache2handler/sapi_apache2.c:509 #13 php_handler (r=0x7fb67a229658) at /usr/src/debug/php-5.3.3/sapi/apache2handler/sapi_apache2.c:681 #14 0x00007fb6784166f0 in ap_run_handler (r=0x7fb67a229658) at /usr/src/debug/httpd-2.2.15/server/config.c:158 #15 0x00007fb678419f58 in ap_invoke_handler (r=0x7fb67a229658) at /usr/src/debug/httpd-2.2.15/server/config.c:372 #16 0x00007fb6784254f0 in ap_process_request (r=0x7fb67a229658) at /usr/src/debug/httpd-2.2.15/modules/http/http_request.c:282 #17 0x00007fb678422418 in ap_process_http_connection (c=0x7fb67a2193a8) at /usr/src/debug/httpd-2.2.15/modules/http/http_core.c:190 #18 0x00007fb67841e1b8 in ap_run_process_connection (c=0x7fb67a2193a8) at /usr/src/debug/httpd-2.2.15/server/connection.c:43 #19 0x00007fb678429f4b in child_main (child_num_arg=<value optimized out>) at /usr/src/debug/httpd-2.2.15/server/mpm/prefork/prefork.c:662 #20 0x00007fb67842a21a in make_child (s=0x7fb679cd7860, slot=153) at /usr/src/debug/httpd-2.2.15/server/mpm/prefork/prefork.c:758 #21 0x00007fb67842aea4 in perform_idle_server_maintenance (_pconf=<value optimized out>, plog=<value optimized out>, s=<value optimized out>) at /usr/src/debug/httpd-2.2.15/server/mpm/prefork/prefork.c:893 #22 ap_mpm_run (_pconf=<value optimized out>, plog=<value optimized out>, s=<value optimized out>) at /usr/src/debug/httpd-2.2.15/server/mpm/prefork/prefork.c:1097 #23 0x00007fb678402890 in main 
(argc=1, argv=0x7fff6fecacb8) at /usr/src/debug/httpd-2.2.15/server/main.c:740 PHP.INI Follows: [PHP] engine = On short_open_tag = On asp_tags = Off precision = 14 y2k_compliance = On output_buffering = 4096 zlib.output_compression = Off implicit_flush = Off unserialize_callback_func = serialize_precision = 100 allow_call_time_pass_reference = Off safe_mode = Off safe_mode_gid = Off safe_mode_include_dir = safe_mode_exec_dir = safe_mode_allowed_env_vars = PHP_ safe_mode_protected_env_vars = LD_LIBRARY_PATH disable_functions = disable_classes = expose_php = On max_execution_time = 30 max_input_time = 60 memory_limit = 128M error_reporting = E_ALL & ~E_DEPRECATED display_errors = Off display_startup_errors = Off log_errors = Off log_errors_max_len = 1024 ignore_repeated_errors = Off ignore_repeated_source = Off report_memleaks = On track_errors = Off html_errors = Off variables_order = "GPCS" request_order = "GP" register_globals = Off register_long_arrays = Off register_argc_argv = Off auto_globals_jit = On post_max_size = 8M magic_quotes_gpc = Off magic_quotes_runtime = Off magic_quotes_sybase = Off auto_prepend_file = auto_append_file = default_mimetype = "text/html" doc_root = user_dir = enable_dl = Off file_uploads = On upload_max_filesize = 2M allow_url_fopen = On allow_url_include = Off default_socket_timeout = 60 [Date] [filter] [iconv] [intl] [sqlite] [sqlite3] [Pcre] [Pdo] [Phar] [Syslog] define_syslog_variables = Off [mail function] SMTP = localhost smtp_port = 25 sendmail_path = /usr/sbin/sendmail -t -i mail.add_x_header = On [SQL] sql.safe_mode = Off [ODBC] odbc.allow_persistent = On odbc.check_persistent = On odbc.max_persistent = -1 odbc.max_links = -1 odbc.defaultlrl = 4096 odbc.defaultbinmode = 1 [MySQL] mysql.allow_persistent = On mysql.max_persistent = -1 mysql.max_links = -1 mysql.default_port = mysql.default_socket = mysql.default_host = mysql.default_user = mysql.default_password = mysql.connect_timeout = 60 mysql.trace_mode = Off [MySQLi] mysqli.max_links = -1 mysqli.default_port = 3306 mysqli.default_socket = mysqli.default_host = mysqli.default_user = mysqli.default_pw = mysqli.reconnect = Off [PostgresSQL] pgsql.allow_persistent = On pgsql.auto_reset_persistent = Off pgsql.max_persistent = -1 pgsql.max_links = -1 pgsql.ignore_notice = 0 pgsql.log_notice = 0 [Sybase-CT] sybct.allow_persistent = On sybct.max_persistent = -1 sybct.max_links = -1 sybct.min_server_severity = 10 sybct.min_client_severity = 10 [bcmath] bcmath.scale = 0 [browscap] [Session] session.save_handler = files session.save_path = "/var/lib/php/session" session.use_cookies = 1 session.use_only_cookies = 1 session.name = PHPSESSID session.auto_start = 1 session.cookie_lifetime = 0 session.cookie_path = / session.cookie_domain = session.cookie_httponly = session.serialize_handler = php session.gc_probability = 1 session.gc_divisor = 1000 session.gc_maxlifetime = 1440 session.bug_compat_42 = Off session.bug_compat_warn = Off session.referer_check = session.entropy_length = 0 session.entropy_file = session.cache_limiter = nocache session.cache_expire = 180 session.use_trans_sid = 0 session.hash_function = 0 session.hash_bits_per_character = 5 url_rewriter.tags = "a=href,area=href,frame=src,input=src,form=fakeentry" [MSSQL] mssql.allow_persistent = On mssql.max_persistent = -1 mssql.max_links = -1 mssql.min_error_severity = 10 mssql.min_message_severity = 10 mssql.compatability_mode = Off mssql.secure_connection = Off [Assertion] [COM] [mbstring] [gd] [exif] [Tidy] tidy.clean_output = Off [soap] 
soap.wsdl_cache_enabled=1 soap.wsdl_cache_dir="/tmp" soap.wsdl_cache_ttl=86400 /etc/php.d/memcached.ini : session.save_path="tcp://memcache1:11211?persistent=1&weight=1&timeout=3&retry_interval=15"
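    One configuration detail that may be worth ruling out before digging deeper into the core dump (an observation on the files as posted, not a confirmed diagnosis): the main php.ini still has session.save_handler = files, while /etc/php.d/memcached.ini only shows a session.save_path override, yet the backtrace dies inside ps_close_memcache from the pecl memcache module. Setting the handler and path together in one place removes that ambiguity; a sketch of a consistent block:

      ; /etc/php.d/memcached.ini  (sketch; the memcache extension is assumed to be loaded
      ; already, and the save_path value is copied from the post)
      session.save_handler = memcache
      session.save_path = "tcp://memcache1:11211?persistent=1&weight=1&timeout=3&retry_interval=15"

    Whether that alone stops the segfaults is another matter; at peak load, the pecl-memcache 3.0.4 session shutdown path shown in frames #1 to #4 of the backtrace is still the place to look.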

    Read the article

  • Oracle performance problem

    - by jreid42
    We are using an Oracle 11g machine that is very powerful and has redundant storage; it's a beast, from what I have been told. We just got this DB for a tool that had around 20 users when I first came on as a co-op; now it's upwards of 150 people, and I am the only one working on it :( We currently have a system in place that distributes Perl scripts across our entire data center, essentially giving us a sort of "grid" computing capability. The Perl scripts run a sort of simulation and report the results back to the database. They do selects / inserts. The load is not very high for each script, but it could be happening across 20-50 systems at the same time. We then have multiple data centers and users all hitting the same database with this same approach. Our main problem is that the database is getting overloaded with connections and having to drop some; we sometimes have upwards of 500 connections. These are old Perl scripts and they do not handle this well: essentially they fail and the results are lost. I would rather avoid rewriting a lot of them, as they are poorly written and a headache to even look at. The database itself is not overloaded; the connection overhead is just too high. We open a connection, make a quick query and then drop the connection: very short connections, but many of them. The database team has basically said we need to lower the number of connections or they are going to ignore us. Because this is distributed across our farm, we can't implement persistent connections. I do this with our web server, but that is on a fixed system; the others are Perl scripts that get opened and closed by the distribution tool and thus aren't always running. What would be my best approach to resolving this issue? The scripts themselves can wait for a connection to open; they do not need to act immediately. Some sort of queuing system? It has been suggested that I set up a few instances of a tool called "SQL Relay", maybe one in each data center. How reliable is this tool? How good is this approach? Would it work for what we need? We could have one per data center and relay requests through it to our main database, keeping a pipeline of open persistent connections? Does this make sense? Are there any other suggestions you can make? Any ideas? Any help would be greatly appreciated. Sadly I am just a co-op student working for a very big company and somehow all of this has landed on my shoulders (there is literally nobody to ask for help; it's a hardware company, everybody is a hardware engineer, and the database team is useless and in India), and I am quite lost as to what the best approach would be. I am extremely overworked and this problem is interfering with ongoing progress; it basically needs to be resolved as quickly as possible, preferably without rewriting the whole system, purchasing hardware (not gonna happen), or shooting myself in the foot. HELP LOL!
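    One option that fits this exact pattern (many clients that connect, run a short query and disconnect) without rewriting the Perl scripts is Oracle 11g's Database Resident Connection Pooling (DRCP): the database keeps a small pool of pooled server processes and hands them out per request, so both the connect/disconnect overhead and the connection count the DBAs see drop sharply. A minimal sketch, assuming the DBA team is willing to start the pool and the client libraries support it (the host and service names below are illustrative only):

      -- run once as a privileged user: start the default pool (sizing can be tuned later)
      EXEC DBMS_CONNECTION_POOL.START_POOL();

      -- tnsnames.ora entry the scripts would connect through; SERVER=POOLED requests a pooled server
      SIMTOOL_POOLED =
        (DESCRIPTION =
          (ADDRESS = (PROTOCOL = TCP)(HOST = dbhost.example.com)(PORT = 1521))
          (CONNECT_DATA = (SERVICE_NAME = simtool.example.com)(SERVER = POOLED))
        )

    SQL Relay (or any external connection proxy per data center) solves the same problem one layer further out; DRCP keeps the pooling inside the database, so nothing new has to be deployed on the farm beyond a connect-string change.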

    Read the article

  • Results Delphi users who wish to use HID USB in windows

    - by Lex Dean
    This is addressed to Delphi users who wish to use HID USB in Windows. HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Enum\USB contains a list of keys holding vendor ID and product ID numbers that coincide with the database on the USB web site. These numbers, and the GUID held within the key, give important information needed to use HID.dll that is otherwise impossible to obtain. Control Panel/System/Hardware/Device Manager/Universal Serial Bus Controllers/Mass Storage Devices/Details simply lists the registry data. Access for the programmer has been documented through the API32.dll with a number of procedures that access the registry. But that is not the problem, even though it looks like the problem! The key issue is information about the registry and how to use it. These keys can be viewed in RegEdit.exe itself. Some parts of the registry, such as the USB branch, have been given a Windows-security-style protection through Authz.dll that controls read and write access to the USB data; the same applies to API32.dll. Only Microsoft gives out these details, and we all know Microsoft hates Delphi, while C users have enjoyed this access for over 10 years. Some will argue that such information should never be given out because some idiot may write a virus (true), but the counter-argument is: do Delphi users need to be denied USB access for another ten years? What I do not have is skill in assembly code. I'm looking for someone who can trace how RegEdit.exe gets its access through Authz.dll to reach the USB data. So I'm asking everyone who reads this to ask any friend they have with this skill to get the Authz.dll information needed. In my experience with USB.org, they reply when they have something positive to say, but do not bother when the answer would be even slightly negative. For all practical purposes, all USB had to do was keep a secure key as they have done, update the same data into an unsecured key every time the data changes so USB developers can access it, and not put developers' access behind Authz.dll. Authz.dll exposes these functions relevant to USB: AuthzFreeResourceManager AuthzFreeContext AuthzAccessCheck(Flags: DWORD; AuthzClientContext: AUTHZ_CLIENT_CONTEXT_HANDLE; pRequest: PAUTHZ_ACCESS_REQUEST; AuditInfo: AUTHZ_AUDIT_INFO_HANDLE; pSecurityDescriptor: PSECURITY_DESCRIPTOR; OptionalSecurityDescriptorArray: PSECURITY_DESCRIPTOR; OptionalSecurityDescriptorCount: DWORD; //OPTIONAL, Var pReply: AUTHZ_ACCESS_REPLY; pAuthzHandle: PAUTHZ_ACCESS_CHECK_RESULTS_HANDLE): BOOl; AuthzInitializeContextFromSid(Flags: DWORD; UserSid: PSID; AuthzResourceManager: AUTHZ_RESOURCE_MANAGER_HANDLE; pExpirationTime: int64; Identifier: LUID; DynamicGroupArgs: PVOID; pAuthzClientContext: PAUTHZ_CLIENT_CONTEXT_HANDLE): BOOL; AuthzInitializeResourceManager(flags: DWORD; pfnAccessCheck: PFN_AUTHZ_DYNAMIC_ACCESS_CHECK; pfnComputeDynamicGroups: PFN_AUTHZ_COMPUTE_DYNAMIC_GROUPS; pfnFreeDynamicGroups: PFN_AUTHZ_FREE_DYNAMIC_GROUPS; ResourceManagerName: PWideChar; pAuthzResourceManager: PAUTHZ_RESOURCE_MANAGER_HANDLE): BOOL; Further details are in Authz.h, as indexed on kolers.com. J Lex Dean.
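    For the registry part of this, a plain registry read is enough to enumerate those vendor/product keys; Authz.dll is not needed just to list them (a few sub-keys can still be blocked by their ACLs, in which case running elevated helps). A minimal Delphi console sketch, using only the standard Registry unit:

      program ListUsbEnum;
      {$APPTYPE CONSOLE}
      uses
        Windows, Classes, SysUtils, Registry;
      var
        Reg: TRegistry;
        Names: TStringList;
        i: Integer;
      begin
        Names := TStringList.Create;
        Reg := TRegistry.Create(KEY_READ);
        try
          Reg.RootKey := HKEY_LOCAL_MACHINE;
          // one sub-key per device model, e.g. VID_046D&PID_C52B
          if Reg.OpenKeyReadOnly('SYSTEM\CurrentControlSet\Enum\USB') then
            Reg.GetKeyNames(Names);
          for i := 0 to Names.Count - 1 do
            Writeln(Names[i]);
        finally
          Reg.Free;
          Names.Free;
        end;
      end.

    Actually talking to a HID device afterwards normally goes through SetupAPI and hid.dll (SetupDiGetClassDevs with the HID class GUID, then the HidD_* routines) rather than through Authz.dll, which is an access-control helper library.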

    Read the article

  • How to convert html textfield/area data to server-side txt file? [closed]

    - by olijake
    How can I make a script that will convert the text/data in a html textfield/textarea and send it to the server, which then saves it as a .txt file for storage? NOTE: I am hosting a website(for testing purposes) using Apache 2.2 on a Windows 7 machine. I downloaded PHP version 5.4.7, but have not yet installed on my server yet (not sure if I will need it, but also not sure how to install it). 1st problem: Saving text to server Html page/section with title textfield, text textarea, and submit button. You would enter a title, the text/notes you need in the textfield, then press the submit button to have it store the text in the textarea, as a .txt file on the server called .txt. 2nd problem: Opening text from server Html with list of all txt files OR textfield for entering in title, then submit button to send the title of the requested .txt file to the server, which would then load it up on the page. Here is what I have so far: (let me know if there is something that I should change or if something just isn't correct in the index.html code I have right now.) <!DOCTYPE HTML> <html> <head> <title>Insert Title</title> <meta http-equiv="Content-Type" content="Text/HTML; charset=UTF-8"/> </head> <body> <form method="post" action="save.INSERT_FILETYPE" name="textfile" enctype="multipart/form-data"> <input type="text" name="title"><br/> <textarea rows="20" cols="100" id="text" name="text"></textarea><br/> <input type="submit" name="submit" value="Submit Text to Server"> </form><br/> <hr style="width: 100%; height: 4px;"><br/> <form method="post" action="open.INSERT_FILETYPE" name="textfile" enctype="multipart/form-data"> <input type="text" name="title"><br/> <input type="submit" name="submit" value="Submit Txt File Request"> </form><br/> <div>Opened text file displays here or goes on another page</div> </body> </html> I plan on using a server side language/script, but ANYTHING that gets the job done is fine. I already tried looking into using some ASP/jScript/PHP, but have had some trouble implementing it into my server. (ie: getting the modules loaded and telling the server what file types to parse.) I know this may be an extremely easy fix, but then in that case, hopefully you wouldn't mind helping me out a little :). If it turns out that this is MUCH more complicated than I expect, then feel free to let me know that, so I don't waste me time running in circles. I appreciate any help/assistance that you can provide, Thanks, Jake EDIT: Wrong Apache version. In response to the comments/closing of this thread: My question: "How exactly do I install the PHP module on the apache server? and is this even possible? and is this even recommended?" ^ In case I wasn't clear enough already To Clarify: I understand the basics of PHP, I just have trouble with INSTALLING PHP on the apache server. (I have used PHP before, but never successfully on apache (so far...)) For my script I wrote something similar to this already (using fopen() and a few other commands): <?php fopen("notes.txt", "r"); file_put_contents("notes.txt",teststring1); ?> I have used javascript for this task before also (although I prefer using PHP and server-side languages): <script language="javascript"> function WriteToFile(){ var fso = new ActiveXObject("Scripting.FileSystemObject"); var s = fso.CreateTextFile("C:\\NewFile.txt", true); var text=document.getElementById("TextArea1").innerText; s.WriteLine(text); s.WriteLine('***********************'); s.Close(); } </script>

    Read the article

< Previous Page | 351 352 353 354 355 356 357 358 359 360 361 362  | Next Page >