Search Results

Search found 14216 results on 569 pages for 'nvidia settings'.


  • Convert DVD to MP4 / H.264 with HD Decrypter and Handbrake

    - by DigitalGeekery
Are you looking for a way to convert your DVD collection to high quality MP4 files? Today we are going to take a look at using DVDFab HD Decrypter along with Handbrake to convert DVDs to MP4 using the H.264 codec.

Process Overview

Handbrake is a great file conversion application, but it unfortunately can't handle DVD copy protection. For that we will use DVDFab's HD Decrypter. HD Decrypter is the always-free portion of the DVDFab application. What HD Decrypter does is remove the copy protection from your DVD and copy the VIDEO_TS and AUDIO_TS folders to your hard drive. Once the copy protection is gone, we will use Handbrake to convert the files to MP4 format with H.264 compression.

Note: You'll get full access to all the options in DVDFab during the 30-day trial period. However, the HD Decrypter is free and will continue to work.

Ripping the DVD

Install both Handbrake and DVDFab HD Decrypter. (Download links below.) Once the applications are installed, place your DVD into your DVD drive and open DVDFab. On the welcome screen, click "Start DVDFab." You'll be prompted to choose your region. Click "OK." The disc is analyzed and opened, and you'll be brought to the main interface. Make sure the Full Disc option is selected in the left panel and that "Copy DVD-Video (VIDEO_TS folder)" is selected. Click "Start."

Don't be confused by the "DVD to DVD" option pop up. We won't actually be burning to DVD; the HD Decrypter portion of the DVDFab suite is part of the DVD to DVD option. Click "OK." The DVD will be ripped to your hard drive. When the copy process is complete, you'll be prompted to insert media to start the write process. We aren't going to be burning to disc, so just click Cancel and then close out of DVDFab.

Converting to MP4

Now we are ready to convert. Open Handbrake and click on the "Source" button at the top left. Select DVD / VIDEO_TS folder from the drop down list. Now we need to browse for the location where DVDFab HD Decrypter copied your movie. By default, that location will be the \DVDFab\Temp\FullDisc directory in your Documents folder. For example, in Windows 7, it would be: C:\Users\%username%\Documents\DVDFab\Temp\FullDisc\[Name of Your DVD] Select the folder, and click "OK." You may be prompted to set a default path in Handbrake; this is an optional step. Click "OK." If you'd like to set a default destination folder, go to Tools on the top menu and select Options. On the General tab, click "Browse" to select a destination output folder. Click "Close" when finished.

Next, click the dropdown list next to "Title" and select the title that matches the length of the movie. You may see more than one title with a similar length. If so, consult the DVD information, or a site like IMDB.com, to find the proper movie title length. Select your container under Output Settings. This will be your final output file extension. We will be using MP4 for this example; you also have the option of MKV.

If you didn't set up a default destination folder, you'll need to select one by clicking the "Browse" button. You can manually customize the output file name and change the output file extension to .mp4 (unless you prefer the iPod-friendly .m4v extension).

Settings

There are a variety of custom settings that can be changed either through the tabs listed under Output Settings, or by selecting one of the Presets to the right.
If you are converting exclusively for any of the devices listed in the preset list, simply click on that device and the settings will be automatically applied in the Output Settings tabs. For more universal (non-Apple) devices or output, select the Normal profile. For the most part, the presets will suit quite nicely. However, you can further customize settings if you'd like.

The Picture tab allows you to tweak the size or cropping region. You must change Anamorphic to Loose or Custom to change the size.

The Video tab allows you to choose your codec. H.264 is the default. You also have the option to choose a target (output) size. The Constant Quality setting is recommended to be between 59% and 63%. Anything over 70% will likely result in an output file larger than the input without any improved quality.

On the Subtitles tab, you can select an available subtitle from the dropdown list and click "Add" to add it to the output file.

When you've finished any customizations you are ready to begin the conversion process. Click "Start." A command window will open and you can follow the process. You'll probably want to find something to do in the meantime, as the process could take a couple of hours. When the process completes, you're ready to watch your video.

Although it's a time-consuming process that involves a couple of steps, this method will give you high quality H.264 video files. If you want to rip and burn your DVDs to ISO, check out our article on how to rip and convert DVDs to an ISO image.

Links

Download DVDFab HD Decrypter (Part of the DVDFab suite)

Download Handbrake
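As a side note, HandBrake also ships a command-line version (HandBrakeCLI), so the conversion step can be scripted instead of clicked through. The snippet below is a minimal sketch only, not part of the original walkthrough: it assumes HandBrakeCLI is installed and on your PATH, and the folder path, title number and quality value are placeholders you would replace with your own.

// Minimal sketch: drive HandBrakeCLI from Node.js to convert the folder that
// DVDFab HD Decrypter produced into an H.264 MP4. Adjust the placeholders below.
var spawn = require("child_process").spawn;

var source = "C:\\Users\\me\\Documents\\DVDFab\\Temp\\FullDisc\\MY_MOVIE"; // DVDFab output folder (placeholder)
var output = "MY_MOVIE.mp4";

var args = [
    "-i", source,   // input: the ripped DVD folder
    "-o", output,   // output file (.mp4 container)
    "-t", "1",      // title number (pick the one matching the movie length)
    "-e", "x264",   // H.264 encoder
    "-q", "20"      // constant quality (the CLI takes an RF-style value, not the GUI percentage)
];

var handbrake = spawn("HandBrakeCLI", args);
handbrake.stdout.pipe(process.stdout); // forward HandBrake's progress output
handbrake.stderr.pipe(process.stderr);
handbrake.on("exit", function (code) {
    console.log("HandBrakeCLI exited with code " + code);
});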

    Read the article

  • EM12c Release 4: New EMCLI Verbs

    - by SubinDaniVarughese
Here are the new EM CLI verbs in Enterprise Manager 12c Release 4 (12.1.0.4). This helps you in writing new scripts or enhancing your existing scripts for further automation.

Basic Administration Verbs
invoke_ws - Invoke EM web service.

ADM Verbs
associate_target_to_adm - Associate a target to an application data model.
export_adm - Export Application Data Model to a specified .xml file.
import_adm - Import Application Data Model from a specified .xml file.
list_adms - List the names, target names and application suites of existing Application Data Models
verify_adm - Submit an application data model verify job for the target specified.

Agent Update Verbs
get_agent_update_status - Show Agent Update Results
get_not_updatable_agents - Shows Not Updatable Agents
get_updatable_agents - Show Updatable Agents
update_agents - Performs Agent Update Prereqs and submits Agent Update Job

BI Publisher Reports Verbs
grant_bipublisher_roles - Grants access to the BI Publisher catalog and features.
revoke_bipublisher_roles - Revokes access to the BI Publisher catalog and features.

Blackout Verbs
create_rbk - Create a Retro-active blackout.

CFW Verbs
cancel_cloud_service_requests - To cancel cloud service requests
delete_cloud_service_instances - To delete cloud service instances
delete_cloud_user_objects - To delete cloud user objects.
get_cloud_service_instances - To get information about cloud service instances
get_cloud_service_requests - To get information about cloud requests
get_cloud_user_objects - To get information about cloud user objects.

Chargeback Verbs
add_chargeback_entity - Adds the given entity to Chargeback.
assign_charge_plan - Assign a plan to a chargeback entity.
assign_cost_center - Assign a cost center to a chargeback entity.
create_charge_entity_type - Create charge entity type
export_charge_plans - Exports charge plans metadata to file
export_custom_charge_items - Exports user defined charge items to a file
import_charge_plans - Imports charge plans metadata from given file
import_custom_charge_items - Imports user defined charge items metadata from given file
list_charge_plans - Gives a list of charge plans in Chargeback.
list_chargeback_entities - Gives a list of all the entities in Chargeback
list_chargeback_entity_types - Gives a list of all the entity types that are supported in Chargeback
list_cost_centers - Lists the cost centers in Chargeback.
remove_chargeback_entity - Removes the given entity from Chargeback.
unassign_charge_plan - Un-assign the plan associated to a chargeback entity.
unassign_cost_center - Un-assign the cost center associated to a chargeback entity.

Configuration/Association History
disable_config_history - Disable configuration history computation for a target type.
enable_config_history - Enable configuration history computation for a target type.
set_config_history_retention_period - Sets the amount of time for which Configuration History is retained.

Configuration Compare
config_compare - Submits the configuration comparison job
get_config_templates - Gets all the comparison templates from the repository

Compliance Verbs
fix_compliance_state - Fix compliance state by removing references in deleted targets.

Credential Verbs
update_credential_set

Data Subset Verbs
export_subset_definition - Exports specified subset definition as XML file at specified directory path.
generate_subset - Generate subset using specified subset definition and target database.
import_subset_definition - Import a subset definition from specified XML file.
import_subset_dump - Imports dump file into specified target database.
list_subset_definitions - Get the list of subset definition, adm and target name

Delete Pluggable Database Job Verbs
delete_pluggable_database - Delete a pluggable database

Deployment Procedure Verbs
get_runtime_data - Get the runtime data of an execution

Discover and Push to Agents Verbs
generate_discovery_input - Generate Discovery Input file for discovering Auto-Discovered Domains
refresh_fa - Refresh Fusion Instance
run_fa_diagnostics - Run Fusion Applications Diagnostics

Fusion Middleware Provisioning Verbs
create_fmw_domain_profile - Create a Fusion Middleware Provisioning Profile from a WebLogic Domain
create_fmw_home_profile - Create a Fusion Middleware Provisioning Profile from an Oracle Home
create_inst_media_profile - Create a Fusion Middleware Provisioning Profile from Installation Media

Gold Agent Image Verbs
create_gold_agent_image - Creates a gold agent image.
decouple_gold_agent_image - Decouples the agent from gold agent image.
delete_gold_agent_image - Deletes a gold agent image.
get_gold_agent_image_activity_status - Gets gold agent image activity status.
get_gold_agent_image_details - Get the gold agent image details.
list_agents_on_gold_image - Lists agents on a gold agent image.
list_gold_agent_image_activities - Lists gold agent image activities.
list_gold_agent_image_series - Lists gold agent image series.
list_gold_agent_images - Lists the available gold agent images.
promote_gold_agent_image - Promotes a gold agent image.
stage_gold_agent_image - Stages a gold agent image.

Incident Rules Verbs
add_target_to_rule_set - Add a target to an enterprise rule set.
delete_incident_record - Delete one or more open incidents
remove_target_from_rule_set - Remove a target from an enterprise rule set.

Job Verbs
export_jobs - Export job details in to an xml file
import_jobs - Import job definitions from an xml file
job_input_file - Supply details for a job verb in a property file
resume_job - Resume a job or set of jobs
suspend_job - Suspend a job or set of jobs

Oracle Database as Service Verbs
config_db_service_target - Configure DB Service target for OPC

Privilege Delegation Settings Verbs
clear_default_privilege_delegation_setting - Clears the default privilege delegation setting for a given list of platforms
set_default_privilege_delegation_setting - Sets the default privilege delegation setting for a given list of platforms
test_privilege_delegation_setting - Tests a Privilege Delegation Setting on a host

SSA Verbs
cleanup_dbaas_requests - Submit cleanup request for failed request
create_dbaas_quota - Create Database Quota for a SSA User Role
create_service_template - Create a Service Template
delete_dbaas_quota - Delete the Database Quota setup for a SSA User Role
delete_service_template - Delete a given service template
get_dbaas_quota - List the Database Quota setup for all SSA User Roles
get_dbaas_request_settings - List the Database Request Settings
get_service_template_detail - Get details of a given service template
get_service_templates - Get the list of available service templates
rename_service_template - Rename a given service template
update_dbaas_quota - Update the Database Quota for a SSA User Role
update_dbaas_request_settings - Update the Database Request Settings
update_service_template - Update a given service template.

Saved Configurations
get_saved_configs - Gets the saved configurations from the repository

Server Generated Alert Metric Verbs
validate_server_generated_alerts - Server Generated Alert Metric Verb

Services Verbs
edit_sl_rule - Edit the service level rule for the specified service

Siebel Verbs
list_siebel_enterprises - List Siebel enterprises currently monitored in EM
list_siebel_servers - List Siebel servers under a specified Siebel enterprise
update_siebel - Update a Siebel enterprise or its underlying servers

SiteGuard Verbs
add_siteguard_aux_hosts - Associate new auxiliary hosts to the system
configure_siteguard_lag - Configure apply lag and transport lag limit for databases
delete_siteguard_aux_host - Delete auxiliary host associated with a site
delete_siteguard_lag - Erases apply lag or transport lag limit for databases
get_siteguard_aux_hosts - Get all auxiliary hosts associated with a site
get_siteguard_health_checks - Shows schedule of health checks
get_siteguard_lag - Shows apply lag or transport lag limit for databases
schedule_siteguard_health_checks - Schedule health checks for an operation plan
stop_siteguard_health_checks - Stops all future health check execution of an operation plan
update_siteguard_lag - Updates apply lag and transport lag limit for databases

Software Library Verbs
stage_swlib_entity_files - Stage files of an entity from Software Library to a host target.

Target Data Verbs
create_assoc - Creates target associations
delete_assoc - Deletes target associations
list_allowed_pairs - Lists allowed association types for specified source and destination
list_assoc - Lists associations between source and destination targets
manage_agent_partnership - Manages partnership between agents. Used for explicitly assigning agent partnerships

Trace Reports
generate_ui_trace_report - Generate and download UI Page performance report (to identify slow rendering pages)

VI EMCLI Verbs
add_virtual_platform - Add Oracle Virtual Platform(s).
modify_virtual_platform - Modify Oracle Virtual Platform.

To get more details about each verb, execute:
$ emcli help <verb_name>
Example: $ emcli help list_assoc

New resources in the list verb
These are the new resources in the EM CLI list verb:

Certificates
WLSCertificateDetails

Credential Resource Group
PreferredCredentialsDefaultSystemScope - Preferred credentials (System Scope)
PreferredCredentialsSystemScope - Target preferred credential

Privilege Delegation Settings
TargetPrivilegeDelegationSettingDetails - List privilege delegation setting details on a host
TargetPrivilegeDelegationSetting - List privilege delegation settings on a host
PrivilegeDelegationSettings - Lists all Privilege Delegation Settings
PrivilegeDelegationSettingDetails - Lists details of Privilege Delegation Settings

To get more details about each resource, execute:
$ emcli list -resource="<resource_name>" -help
Example: $ emcli list -resource="PrivilegeDelegationSettings" -help

Deprecated Verbs
Agent Administration Verbs
resecure_agent - Resecure an agent

To get the complete list of verbs, execute:
$ emcli help
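Since the point of the new verbs is script automation, here is a small illustrative sketch (not from the original post) of driving EM CLI from a Node.js script. It assumes the emcli client is on the PATH and that a session is already established with emcli login; the wrapper itself is hypothetical, while the emcli commands are the ones shown above.

// Illustrative sketch: print the help text for a few of the new verbs by
// shelling out to emcli (assumes emcli is on the PATH and you are logged in).
var execFile = require("child_process").execFile;

var verbs = ["list_assoc", "create_rbk", "get_saved_configs"];

verbs.forEach(function (verb) {
    execFile("emcli", ["help", verb], function (error, stdout, stderr) {
        if (error) {
            console.error("emcli help " + verb + " failed: " + error.message);
            return;
        }
        console.log(stdout);
    });
});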

    Read the article

  • GPGPU

What

GPU obviously stands for Graphics Processing Unit (the silicon powering the display you are using to read this blog post). The extra GP in front of that stands for General Purpose computing. So, altogether GPGPU refers to computing we can perform on a GPU for purposes beyond just drawing on the screen. In effect, we can use a GPGPU a bit like we already use a CPU: to perform some calculation (that doesn't have to have any visual element to it). The attraction is that a GPGPU can be orders of magnitude faster than a CPU.

Why

When I was at the SuperComputing conference in Portland last November, GPGPUs were all the rage. A quick online search reveals many articles introducing the GPGPU topic. I'll just share 3 here: pcper (ignoring all pages except the first, it is a good consumer perspective), gizmodo (nice take using mostly layman terms) and vizworld (answering the question on "what's the big deal").

The GPGPU programming paradigm (from a high level) is simple: in your CPU program you define functions (aka kernels) that take some input, perform the costly operation and return the output. The kernels are the things that execute on the GPGPU, leveraging its power (and hence executing faster than they could on the CPU), while the host CPU program waits for the results or asynchronously performs other tasks.

However, GPGPUs have different characteristics to CPUs, which means they are suitable only for certain classes of problem (i.e. data parallel algorithms) and not for others (e.g. algorithms with branching or recursion or other complex flow control). You also pay a high cost for transferring the input data from the CPU to the GPU (and vice versa for the results back to the CPU), so the computation itself has to be long enough to justify the overhead transfer costs. If your problem space fits the criteria then you probably want to check out this technology.

How

So where can you get a graphics card to start playing with all this? At the time of writing, the two main vendors, ATI (owned by AMD) and NVIDIA, are the obvious players in this industry. You can read about GPGPU on this AMD page and also on this NVIDIA page. NVIDIA's website also has a free chapter on the topic from the "GPU Gems" book: A Toolkit for Computation on GPUs.

If you followed the links above, then you've already come across some of the choices of programming models that are available today. Essentially, AMD is offering their ATI Stream technology accessible via a language they call Brook+; NVIDIA offers their CUDA platform which is accessible from CUDA C. Choosing either of those locks you into the GPU vendor and hence your code cannot run on systems with cards from the other vendor (e.g. imagine if your CPU code would run on Intel chips but not AMD chips). Having said that, both vendors plan to support a new emerging standard called OpenCL, which theoretically means your kernels can execute on any GPU that supports it. To learn more about all of these there is a website: gpgpu.org. The caveat about that site is that (currently) it completely ignores the Microsoft approach, which I touch on next.

On Windows, there is already a cross-GPU-vendor way of programming GPUs and that is the DirectX API. Specifically, on Windows Vista and Windows 7, the DirectX 11 API offers a dedicated subset of the API for GPGPU programming: DirectCompute. You use this API on the CPU side to set up and execute the kernels that run on the GPU. The kernels are written in a language called HLSL (High Level Shader Language). You can use DirectCompute with HLSL to write a "compute shader", which is the term DirectX uses for what I've been referring to in this post as a "kernel". For a comprehensive collection of links about this (including tutorials, videos and samples) please see my blog post: DirectCompute.

Note that there are many efforts to build even higher level languages on top of DirectX that aim to expose GPGPU programming to a wider audience by making it as easy as today's mainstream programming models. I'll mention here just two of those efforts: Accelerator from MSR and Brahma by Ananth. Comments about this post welcome at the original blog.
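To make the kernel/host split described above a little more concrete, here is a tiny CPU-only JavaScript sketch of the pattern. The names (saxpyKernel, dispatch) are made up for illustration and nothing here actually runs on a GPU; a real GPGPU runtime such as CUDA, Brook+, OpenCL or DirectCompute would compile the kernel for the device and handle the host-to-device data transfer.

// A CPU-only sketch of the kernel/host pattern (no GPU involved).
// "saxpyKernel" plays the role of a GPGPU kernel: a pure, per-element function.
// "dispatch" plays the role of the host code that launches one kernel
// invocation per element; on a GPU these invocations would run in parallel.
function saxpyKernel(a, x, y) {
    // one independent unit of work: no shared state, no recursion
    return a * x + y;
}

function dispatch(kernel, a, xs, ys) {
    var out = new Array(xs.length);
    for (var i = 0; i < xs.length; i++) {
        out[i] = kernel(a, xs[i], ys[i]);
    }
    return out;
}

console.log(dispatch(saxpyKernel, 2, [1, 2, 3], [10, 20, 30])); // [ 12, 24, 36 ]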

    Read the article

  • Node.js Adventure - Storage Services and Service Runtime

    - by Shaun
When I described how to host a Node.js application on Windows Azure, one question that might be raised is how to consume the various Windows Azure services, such as storage, service bus, access control, etc. Interacting with Windows Azure services from Node.js is available through the Windows Azure Node.js SDK, which is a module available in NPM. In this post I would like to describe how to use Windows Azure Storage (a.k.a. WAS) as well as the service runtime.

Consume Windows Azure Storage

Let's first have a look at how to consume WAS through Node.js. As we know from the previous post, we can host a Node.js application on Windows Azure Web Site (a.k.a. WAWS) as well as Windows Azure Cloud Service (a.k.a. WACS). In theory, WAWS is also built on top of WACS worker roles with some more features. Hence in this post I will only demonstrate hosting in a WACS worker role. The Node.js code for consuming WAS can also be used when hosted on WAWS, but since there are no roles in WAWS, the code for consuming the service runtime mentioned in the next section cannot be used in a WAWS Node application.

We can use the solution that I created in my last post. Alternatively, we can create a new Windows Azure project in Visual Studio with a worker role, add "node.exe" and "index.js", install the "express" and "node-sqlserver" modules, and mark all files as "Copy always". In order to use Windows Azure services we need the Windows Azure Node.js SDK, also known as a module named "azure", which can be installed through NPM. Once downloaded and installed, we need to include it in our worker role project and mark it as "Copy always". You can use my "Copy all always" tool mentioned in my last post to update the current worker role project file. You can also find the source code of this tool here.

The source code of the Windows Azure SDK for Node.js can be found on its GitHub page. It contains two parts. One is a CLI tool which provides a cross-platform command line package for Mac and Linux to manage WAWS and Windows Azure Virtual Machines (a.k.a. WAVM). The other is a library for managing and consuming various Windows Azure services, including tables, blobs, queues, service bus and the service runtime. I will not cover all of them, but will only demonstrate how to use tables and service runtime information in this post. You can find the full documentation of this SDK here.

Back in Visual Studio, open "index.js" and let's continue our application from the last post, which was working against Windows Azure SQL Database (a.k.a. WASD). The code should look like this.
var express = require("express");
var sql = require("node-sqlserver");

var connectionString = "Driver={SQL Server Native Client 10.0};Server=tcp:ac6271ya9e.database.windows.net,1433;Database=synctile;Uid=shaunxu@ac6271ya9e;Pwd={PASSWORD};Encrypt=yes;Connection Timeout=30;";
var port = 80;

var app = express();

app.configure(function () {
    app.use(express.bodyParser());
});

app.get("/", function (req, res) {
    sql.open(connectionString, function (err, conn) {
        if (err) {
            console.log(err);
            res.send(500, "Cannot open connection.");
        }
        else {
            conn.queryRaw("SELECT * FROM [Resource]", function (err, results) {
                if (err) {
                    console.log(err);
                    res.send(500, "Cannot retrieve records.");
                }
                else {
                    res.json(results);
                }
            });
        }
    });
});

app.get("/text/:key/:culture", function (req, res) {
    sql.open(connectionString, function (err, conn) {
        if (err) {
            console.log(err);
            res.send(500, "Cannot open connection.");
        }
        else {
            var key = req.params.key;
            var culture = req.params.culture;
            var command = "SELECT * FROM [Resource] WHERE [Key] = '" + key + "' AND [Culture] = '" + culture + "'";
            conn.queryRaw(command, function (err, results) {
                if (err) {
                    console.log(err);
                    res.send(500, "Cannot retrieve records.");
                }
                else {
                    res.json(results);
                }
            });
        }
    });
});

app.get("/sproc/:key/:culture", function (req, res) {
    sql.open(connectionString, function (err, conn) {
        if (err) {
            console.log(err);
            res.send(500, "Cannot open connection.");
        }
        else {
            var key = req.params.key;
            var culture = req.params.culture;
            var command = "EXEC GetItem '" + key + "', '" + culture + "'";
            conn.queryRaw(command, function (err, results) {
                if (err) {
                    console.log(err);
                    res.send(500, "Cannot retrieve records.");
                }
                else {
                    res.json(results);
                }
            });
        }
    });
});

app.post("/new", function (req, res) {
    var key = req.body.key;
    var culture = req.body.culture;
    var val = req.body.val;

    sql.open(connectionString, function (err, conn) {
        if (err) {
            console.log(err);
            res.send(500, "Cannot open connection.");
        }
        else {
            var command = "INSERT INTO [Resource] VALUES ('" + key + "', '" + culture + "', N'" + val + "')";
            conn.queryRaw(command, function (err, results) {
                if (err) {
                    console.log(err);
                    res.send(500, "Cannot retrieve records.");
                }
                else {
                    res.send(200, "Inserted Successful");
                }
            });
        }
    });
});

app.listen(port);

Now let's create a new function that copies the records from WASD to the table service:
1. Delete the table named "resource".
2. Create a new table named "resource". These 2 steps ensure that we have an empty table.
3. Load all records from the "resource" table in WASD.
4. For each record loaded from WASD, insert it into the table one by one.
5. Prompt the user when finished.

In order to use the table service we need the storage account and key, which can be found in the developer portal. Just select the storage account and click the Manage Keys button. Then create two local variables in our Node.js application for the storage account name and key. Since we need to use WAS we need to import the azure module. I also created another variable that stores the table name. In order to work with the table service I need to create the storage client for the table service.
This is very similar to the Windows Azure SDK for .NET. As in the code below, I created a new variable named "client" and used "createTableService", specifying my storage account name and key.

var azure = require("azure");
var storageAccountName = "synctile";
var storageAccountKey = "/cOy9L7xysXOgPYU9FjDvjrRAhaMX/5tnOpcjqloPNDJYucbgTy7MOrAW7CbUg6PjaDdmyl+6pkwUnKETsPVNw==";
var tableName = "resource";
var client = azure.createTableService(storageAccountName, storageAccountKey);

Now create a new function for the URL "/was/init" so that we can trigger it through the browser. In this function we will first load all records from WASD.

app.get("/was/init", function (req, res) {
    // load all records from windows azure sql database
    sql.open(connectionString, function (err, conn) {
        if (err) {
            console.log(err);
            res.send(500, "Cannot open connection.");
        }
        else {
            conn.queryRaw("SELECT * FROM [Resource]", function (err, results) {
                if (err) {
                    console.log(err);
                    res.send(500, "Cannot retrieve records.");
                }
                else {
                    if (results.rows.length > 0) {
                        // begin to transform the records into table service
                    }
                }
            });
        }
    });
});

Once we have successfully loaded all records we can start to transform them into the table service. First I need to recreate the table in the table service. This can be done by deleting and creating the table through the table client I had just created previously.

app.get("/was/init", function (req, res) {
    // load all records from windows azure sql database
    sql.open(connectionString, function (err, conn) {
        if (err) {
            console.log(err);
            res.send(500, "Cannot open connection.");
        }
        else {
            conn.queryRaw("SELECT * FROM [Resource]", function (err, results) {
                if (err) {
                    console.log(err);
                    res.send(500, "Cannot retrieve records.");
                }
                else {
                    if (results.rows.length > 0) {
                        // begin to transform the records into table service
                        // recreate the table named 'resource'
                        client.deleteTable(tableName, function (error) {
                            client.createTableIfNotExists(tableName, function (error) {
                                if (error) {
                                    error["target"] = "createTableIfNotExists";
                                    res.send(500, error);
                                }
                                else {
                                    // transform the records
                                }
                            });
                        });
                    }
                }
            });
        }
    });
});

As you can see, the azure SDK provides its methods in a callback pattern. In fact, almost all modules in Node.js use the callback pattern. For example, when I deleted a table I invoked the "deleteTable" method, providing the name of the table and a callback function which will be performed when the table has been deleted or the operation failed. Under the hood, the azure module performs the table deletion asynchronously in a POSIX async thread pool, and once it's done the callback function is invoked. This is the reason we need to nest the table creation code inside the deletion callback. If we put the table creation code after the deletion code, the two operations would be invoked in parallel. Next, for each record in WASD I created an entity and inserted it into the table service. Finally I send the response to the browser. Can you find a bug in the code below? I will describe it later in this post.
app.get("/was/init", function (req, res) {
    // load all records from windows azure sql database
    sql.open(connectionString, function (err, conn) {
        if (err) {
            console.log(err);
            res.send(500, "Cannot open connection.");
        }
        else {
            conn.queryRaw("SELECT * FROM [Resource]", function (err, results) {
                if (err) {
                    console.log(err);
                    res.send(500, "Cannot retrieve records.");
                }
                else {
                    if (results.rows.length > 0) {
                        // begin to transform the records into table service
                        // recreate the table named 'resource'
                        client.deleteTable(tableName, function (error) {
                            client.createTableIfNotExists(tableName, function (error) {
                                if (error) {
                                    error["target"] = "createTableIfNotExists";
                                    res.send(500, error);
                                }
                                else {
                                    // transform the records
                                    for (var i = 0; i < results.rows.length; i++) {
                                        var entity = {
                                            "PartitionKey": results.rows[i][1],
                                            "RowKey": results.rows[i][0],
                                            "Value": results.rows[i][2]
                                        };
                                        client.insertEntity(tableName, entity, function (error) {
                                            if (error) {
                                                error["target"] = "insertEntity";
                                                res.send(500, error);
                                            }
                                            else {
                                                console.log("entity inserted");
                                            }
                                        });
                                    }
                                    // send the response
                                    console.log("all done");
                                    res.send(200, "All done!");
                                }
                            });
                        });
                    }
                }
            });
        }
    });
});

Now we can publish it to the cloud and have a try. But normally we'd better test it in the local emulator first. In the Node.js SDK there are three built-in properties which provide the account name, key and host address for the local storage emulator. We can use them to initialize our table service client. We also need to change the SQL connection string to let it use my local database. The code will be changed as below.

// windows azure sql database
//var connectionString = "Driver={SQL Server Native Client 10.0};Server=tcp:ac6271ya9e.database.windows.net,1433;Database=synctile;Uid=shaunxu@ac6271ya9e;Pwd=eszqu94XZY;Encrypt=yes;Connection Timeout=30;";
// sql server
var connectionString = "Driver={SQL Server Native Client 11.0};Server={.};Database={Caspar};Trusted_Connection={Yes};";

var azure = require("azure");
var storageAccountName = "synctile";
var storageAccountKey = "/cOy9L7xysXOgPYU9FjDvjrRAhaMX/5tnOpcjqloPNDJYucbgTy7MOrAW7CbUg6PjaDdmyl+6pkwUnKETsPVNw==";
var tableName = "resource";
// windows azure storage
//var client = azure.createTableService(storageAccountName, storageAccountKey);
// local storage emulator
var client = azure.createTableService(azure.ServiceClient.DEVSTORE_STORAGE_ACCOUNT, azure.ServiceClient.DEVSTORE_STORAGE_ACCESS_KEY, azure.ServiceClient.DEVSTORE_TABLE_HOST);

Now let's run the application and navigate to "localhost:12345/was/init", as I hosted it on port 12345. We can find that it transformed the data from my local database to the local table service. Everything looks fine. But there is a bug in my code. If we have a look at the Node.js command window we will find that it sent the response before all records had been inserted, which is not what I expected. The reason is that, as I mentioned before, Node.js performs all IO operations in a non-blocking model. When we inserted the records we executed the table service insert method in parallel, and the operation of sending the response was also executed in parallel, even though I wrote it at the end of my logic.
The correct logic should be: when all entities have been copied to the table service with no error, send the response to the browser; otherwise send an error message to the browser. To do so I need to import another module named "async", which helps us coordinate our asynchronous code. Install the module and import it at the beginning of the code. Then we can use its "forEach" method for the asynchronous code that inserts the table entities. The first argument of "forEach" is the array that will be processed. The second argument is the operation to perform for each item in the array. And the third argument will be invoked when all items have been processed or any error occurs. Here we can send our response to the browser.

app.get("/was/init", function (req, res) {
    // load all records from windows azure sql database
    sql.open(connectionString, function (err, conn) {
        if (err) {
            console.log(err);
            res.send(500, "Cannot open connection.");
        }
        else {
            conn.queryRaw("SELECT * FROM [Resource]", function (err, results) {
                if (err) {
                    console.log(err);
                    res.send(500, "Cannot retrieve records.");
                }
                else {
                    if (results.rows.length > 0) {
                        // begin to transform the records into table service
                        // recreate the table named 'resource'
                        client.deleteTable(tableName, function (error) {
                            client.createTableIfNotExists(tableName, function (error) {
                                if (error) {
                                    error["target"] = "createTableIfNotExists";
                                    res.send(500, error);
                                }
                                else {
                                    async.forEach(results.rows,
                                        // transform the records
                                        function (row, callback) {
                                            var entity = {
                                                "PartitionKey": row[1],
                                                "RowKey": row[0],
                                                "Value": row[2]
                                            };
                                            client.insertEntity(tableName, entity, function (error) {
                                                if (error) {
                                                    callback(error);
                                                }
                                                else {
                                                    console.log("entity inserted.");
                                                    callback(null);
                                                }
                                            });
                                        },
                                        // send response
                                        function (error) {
                                            if (error) {
                                                error["target"] = "insertEntity";
                                                res.send(500, error);
                                            }
                                            else {
                                                console.log("all done");
                                                res.send(200, "All done!");
                                            }
                                        }
                                    );
                                }
                            });
                        });
                    }
                }
            });
        }
    });
});

Run it locally and now we can see the response is sent after all entities have been inserted. Querying entities against the table service is simple as well. Just use the "queryEntity" method from the table service client, providing the partition key and row key. We can also provide more complex query criteria, for example the code here. In the code below I query an entity by partition key and row key, and return the proper localization value in the response.

app.get("/was/:key/:culture", function (req, res) {
    var key = req.params.key;
    var culture = req.params.culture;
    client.queryEntity(tableName, culture, key, function (error, entity) {
        if (error) {
            res.send(500, error);
        }
        else {
            res.json(entity);
        }
    });
});

And then I tested it on the local emulator. Finally, if we want to publish this application to the cloud we should change the database connection string and storage account. For more information about how to consume the blob and queue services, as well as the service bus, please refer to the MSDN page.

Consume Service Runtime

As I mentioned above, before we publish our application to the cloud we need to change the connection string and account information in our code.
But if you have played with WACS you should know that the service runtime provides the ability to retrieve configuration settings, endpoints and local resource information at runtime. This means we can have these values defined in the CSCFG and CSDEF files and the runtime will be able to retrieve the proper values. For example, we can add some role settings through the property window of the role, specifying the connection string and storage account for cloud and local. And we can also use the endpoint defined in the role environment in our Node.js application.

In the Node.js SDK we can get an object from "azure.RoleEnvironment", which provides the functionality to retrieve the configuration settings and endpoints, etc. In the code below I defined the connection string variables and then used the SDK to retrieve them and initialize the table client.

var connectionString = "";
var storageAccountName = "";
var storageAccountKey = "";
var tableName = "";
var client;

azure.RoleEnvironment.getConfigurationSettings(function (error, settings) {
    if (error) {
        console.log("ERROR: getConfigurationSettings");
        console.log(JSON.stringify(error));
    }
    else {
        console.log(JSON.stringify(settings));
        connectionString = settings["SqlConnectionString"];
        storageAccountName = settings["StorageAccountName"];
        storageAccountKey = settings["StorageAccountKey"];
        tableName = settings["TableName"];

        console.log("connectionString = %s", connectionString);
        console.log("storageAccountName = %s", storageAccountName);
        console.log("storageAccountKey = %s", storageAccountKey);
        console.log("tableName = %s", tableName);

        client = azure.createTableService(storageAccountName, storageAccountKey);
    }
});

In this way we don't need to amend the code for the configuration differences between the local and cloud environments, since the service runtime takes care of it. At the end of the code we also listen on the port retrieved from the SDK.

azure.RoleEnvironment.getCurrentRoleInstance(function (error, instance) {
    if (error) {
        console.log("ERROR: getCurrentRoleInstance");
        console.log(JSON.stringify(error));
    }
    else {
        console.log(JSON.stringify(instance));
        if (instance["endpoints"] && instance["endpoints"]["nodejs"]) {
            var endpoint = instance["endpoints"]["nodejs"];
            app.listen(endpoint["port"]);
        }
        else {
            app.listen(8080);
        }
    }
});

But if we tested the application right now we would find that it cannot retrieve any values from the service runtime. This is because, by default, the entry point of this role is defined to be the worker role class. In the Windows Azure environment the service runtime opens a named pipe to the entry point instance, so that it can connect to the runtime and retrieve values. But in this case, since the entry point was the worker role class and Node.js was started inside the role, the named pipe was established between our worker role class and the service runtime, so our Node.js application cannot use it. To fix this problem we need to open the CSDEF file under the Azure project and add a new element named Runtime, then add an element named EntryPoint which specifies the Node.js command line. That way the Node.js application has the connection to the service runtime and is able to read the configurations. Starting Node.js in the local emulator, we can see that it retrieved the connection string and storage account for the local environment.
And if we publish our application to Azure, it works against WASD and the storage service through the cloud configuration.

Summary

In this post I demonstrated how to use the Windows Azure SDK for Node.js to interact with the storage service, especially the table service. I also demonstrated how to use the WACS service runtime to retrieve the configuration settings and the endpoint information. In order to make the service runtime available to my Node.js application I needed to create an entry point element in the CSDEF file and set "node.exe" as the entry point.

I used five posts to introduce and demonstrate how to run a Node.js application on the Windows platform, and how to use Windows Azure Web Site and a Windows Azure Cloud Service worker role to host our Node.js application. I also described how to work with other services provided by the Windows Azure platform through the Windows Azure SDK for Node.js.

Node.js is a very new and young network application platform. But since it is simple, easy to learn and deploy, and uses a single-threaded, non-blocking IO model, Node.js has become more and more popular for web application and web service development, especially for IO-sensitive projects. And as Node.js is very good at scaling out, it is all the more useful on a cloud computing platform.

Using Node.js on the Windows platform is new, too. The modules for SQL database and the Windows Azure SDK are still under development and enhancement. "node-sqlserver" doesn't support SQL parameters yet, while "azure" does support using a storage connection string to create the storage client. But Microsoft is working on making them easier to use, and on adding more features and functionality.

PS: you can download the source code here. You can download the source code of my "Copy all always" tool here.

Hope this helps,
Shaun

All documents and related graphics and code are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

    Read the article

  • Unable to install Xdebug

    - by burnt1ce
    I've registered xdebug in php.ini (as per http://xdebug.org/docs/install) but it's not showing up when i run "php -m" or when i get a test page to run "phpinfo()". I've just installed the latest version of XAMPP. Can anyone provide any suggestions in getting xdebug to show up? This is what i get when i run phpinfo(). **PHP Version 5.3.1** System Windows NT ANDREW_LAPTOP 5.1 build 2600 (Windows XP Professional Service Pack 3) i586 Build Date Nov 20 2009 17:20:57 Compiler MSVC6 (Visual C++ 6.0) Architecture x86 Configure Command cscript /nologo configure.js "--enable-snapshot-build" Server API Apache 2.0 Handler Virtual Directory Support enabled Configuration File (php.ini) Path no value Loaded Configuration File C:\xampp\php\php.ini Scan this dir for additional .ini files (none) Additional .ini files parsed (none) PHP API 20090626 PHP Extension 20090626 Zend Extension 220090626 Zend Extension Build API220090626,TS,VC6 PHP Extension Build API20090626,TS,VC6 Debug Build no Thread Safety enabled Zend Memory Manager enabled Zend Multibyte Support disabled IPv6 Support enabled Registered PHP Streams https, ftps, php, file, glob, data, http, ftp, compress.zlib, compress.bzip2, phar, zip Registered Stream Socket Transports tcp, udp, ssl, sslv3, sslv2, tls Registered Stream Filters convert.iconv.*, string.rot13, string.toupper, string.tolower, string.strip_tags, convert.*, consumed, dechunk, zlib.*, bzip2.* This program makes use of the Zend Scripting Language Engine: Zend Engine v2.3.0, Copyright (c) 1998-2009 Zend Technologies PHP Credits Configuration apache2handler Apache Version Apache/2.2.14 (Win32) DAV/2 mod_ssl/2.2.14 OpenSSL/0.9.8l mod_autoindex_color PHP/5.3.1 mod_apreq2-20090110/2.7.1 mod_perl/2.0.4 Perl/v5.10.1 Apache API Version 20051115 Server Administrator postmaster@localhost Hostname:Port localhost:80 Max Requests Per Child: 0 - Keep Alive: on - Max Per Connection: 100 Timeouts Connection: 300 - Keep-Alive: 5 Virtual Server No Server Root C:/xampp/apache Loaded Modules core mod_win32 mpm_winnt http_core mod_so mod_actions mod_alias mod_asis mod_auth_basic mod_auth_digest mod_authn_default mod_authn_file mod_authz_default mod_authz_groupfile mod_authz_host mod_authz_user mod_cgi mod_dav mod_dav_fs mod_dav_lock mod_dir mod_env mod_headers mod_include mod_info mod_isapi mod_log_config mod_mime mod_negotiation mod_rewrite mod_setenvif mod_ssl mod_status mod_autoindex_color mod_php5 mod_perl mod_apreq2 Directive Local Value Master Value engine 1 1 last_modified 0 0 xbithack 0 0 Apache Environment Variable Value MIBDIRS C:/xampp/php/extras/mibs MYSQL_HOME C:\xampp\mysql\bin OPENSSL_CONF C:/xampp/apache/bin/openssl.cnf PHP_PEAR_SYSCONF_DIR C:\xampp\php PHPRC C:\xampp\php TMP C:\xampp\tmp HTTP_HOST localhost HTTP_CONNECTION keep-alive HTTP_USER_AGENT Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/533.2 (KHTML, like Gecko) Chrome/5.0.342.8 Safari/533.2 HTTP_CACHE_CONTROL max-age=0 HTTP_ACCEPT application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5 HTTP_ACCEPT_ENCODING gzip,deflate,sdch HTTP_ACCEPT_LANGUAGE en-US,en;q=0.8 HTTP_ACCEPT_CHARSET ISO-8859-1,utf-8;q=0.7,*;q=0.3 PATH C:\Documents and Settings\Andrew\Local Settings\Application Data\Google\Chrome\Application;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;c:\Program Files\Microsoft SQL Server\100\Tools\Binn\;c:\Program Files\Microsoft SQL Server\100\DTS\Binn\;C:\Program Files\QuickTime\QTSystem\;C:\Program Files\Common Files\DivX Shared\;C:\Program Files\WiTopia.Net\bin 
SystemRoot C:\WINDOWS COMSPEC C:\WINDOWS\system32\cmd.exe PATHEXT .COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH WINDIR C:\WINDOWS SERVER_SIGNATURE <address>Apache/2.2.14 (Win32) DAV/2 mod_ssl/2.2.14 OpenSSL/0.9.8l mod_autoindex_color PHP/5.3.1 mod_apreq2-20090110/2.7.1 mod_perl/2.0.4 Perl/v5.10.1 Server at localhost Port 80</address> SERVER_SOFTWARE Apache/2.2.14 (Win32) DAV/2 mod_ssl/2.2.14 OpenSSL/0.9.8l mod_autoindex_color PHP/5.3.1 mod_apreq2-20090110/2.7.1 mod_perl/2.0.4 Perl/v5.10.1 SERVER_NAME localhost SERVER_ADDR 127.0.0.1 SERVER_PORT 80 REMOTE_ADDR 127.0.0.1 DOCUMENT_ROOT C:/xampp/htdocs SERVER_ADMIN postmaster@localhost SCRIPT_FILENAME C:/xampp/htdocs/test.php REMOTE_PORT 3275 GATEWAY_INTERFACE CGI/1.1 SERVER_PROTOCOL HTTP/1.1 REQUEST_METHOD GET QUERY_STRING no value REQUEST_URI /test.php SCRIPT_NAME /test.php HTTP Headers Information HTTP Request Headers HTTP Request GET /test.php HTTP/1.1 Host localhost Connection keep-alive User-Agent Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/533.2 (KHTML, like Gecko) Chrome/5.0.342.8 Safari/533.2 Cache-Control max-age=0 Accept application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5 Accept-Encoding gzip,deflate,sdch Accept-Language en-US,en;q=0.8 Accept-Charset ISO-8859-1,utf-8;q=0.7,*;q=0.3 HTTP Response Headers X-Powered-By PHP/5.3.1 Keep-Alive timeout=5, max=80 Connection Keep-Alive Transfer-Encoding chunked Content-Type text/html bcmath BCMath support enabled Directive Local Value Master Value bcmath.scale 0 0 bz2 BZip2 Support Enabled Stream Wrapper support compress.bz2:// Stream Filter support bzip2.decompress, bzip2.compress BZip2 Version 1.0.5, 10-Dec-2007 calendar Calendar support enabled com_dotnet COM support enabled DCOM support disabled .Net support enabled Directive Local Value Master Value com.allow_dcom 0 0 com.autoregister_casesensitive 1 1 com.autoregister_typelib 0 0 com.autoregister_verbose 0 0 com.code_page no value no value com.typelib_file no value no value Core PHP Version 5.3.1 Directive Local Value Master Value allow_call_time_pass_reference On On allow_url_fopen On On allow_url_include Off Off always_populate_raw_post_data Off Off arg_separator.input & & arg_separator.output &amp; &amp; asp_tags Off Off auto_append_file no value no value auto_globals_jit On On auto_prepend_file no value no value browscap C:\xampp\php\extras\browscap.ini C:\xampp\php\extras\browscap.ini default_charset no value no value default_mimetype text/html text/html define_syslog_variables Off Off disable_classes no value no value disable_functions no value no value display_errors On On display_startup_errors On On doc_root no value no value docref_ext no value no value docref_root no value no value enable_dl On On error_append_string no value no value error_log no value no value error_prepend_string no value no value error_reporting 22519 22519 exit_on_timeout Off Off expose_php On On extension_dir C:\xampp\php\ext C:\xampp\php\ext file_uploads On On highlight.bg #FFFFFF #FFFFFF highlight.comment #FF8000 #FF8000 highlight.default #0000BB #0000BB highlight.html #000000 #000000 highlight.keyword #007700 #007700 highlight.string #DD0000 #DD0000 html_errors On On ignore_repeated_errors Off Off ignore_repeated_source Off Off ignore_user_abort Off Off implicit_flush Off Off include_path .;C:\xampp\php\PEAR .;C:\xampp\php\PEAR log_errors Off Off log_errors_max_len 1024 1024 magic_quotes_gpc Off Off magic_quotes_runtime Off Off magic_quotes_sybase Off Off mail.add_x_header Off Off 
mail.force_extra_parameters no value no value mail.log no value no value max_execution_time 60 60 max_file_uploads 20 20 max_input_nesting_level 64 64 max_input_time 60 60 memory_limit 128M 128M open_basedir no value no value output_buffering no value no value output_handler no value no value post_max_size 128M 128M precision 14 14 realpath_cache_size 16K 16K realpath_cache_ttl 120 120 register_argc_argv On On register_globals Off Off register_long_arrays Off Off report_memleaks On On report_zend_debug On On request_order no value no value safe_mode Off Off safe_mode_exec_dir no value no value safe_mode_gid Off Off safe_mode_include_dir no value no value sendmail_from no value no value sendmail_path no value no value serialize_precision 100 100 short_open_tag Off Off SMTP localhost localhost smtp_port 25 25 sql.safe_mode Off Off track_errors Off Off unserialize_callback_func no value no value upload_max_filesize 128M 128M upload_tmp_dir C:\xampp\tmp C:\xampp\tmp user_dir no value no value user_ini.cache_ttl 300 300 user_ini.filename .user.ini .user.ini variables_order GPCS GPCS xmlrpc_error_number 0 0 xmlrpc_errors Off Off y2k_compliance On On zend.enable_gc On On ctype ctype functions enabled date date/time support enabled "Olson" Timezone Database Version 2009.18 Timezone Database internal Default timezone America/New_York Directive Local Value Master Value date.default_latitude 31.7667 31.7667 date.default_longitude 35.2333 35.2333 date.sunrise_zenith 90.583333 90.583333 date.sunset_zenith 90.583333 90.583333 date.timezone America/New_York America/New_York dom DOM/XML enabled DOM/XML API Version 20031129 libxml Version 2.7.6 HTML Support enabled XPath Support enabled XPointer Support enabled Schema Support enabled RelaxNG Support enabled ereg Regex Library System library enabled exif EXIF Support enabled EXIF Version 1.4 $Id: exif.c 287372 2009-08-16 14:32:32Z iliaa $ Supported EXIF Version 0220 Supported filetypes JPEG,TIFF Directive Local Value Master Value exif.decode_jis_intel JIS JIS exif.decode_jis_motorola JIS JIS exif.decode_unicode_intel UCS-2LE UCS-2LE exif.decode_unicode_motorola UCS-2BE UCS-2BE exif.encode_jis no value no value exif.encode_unicode ISO-8859-15 ISO-8859-15 fileinfo fileinfo support enabled version 1.0.5-dev filter Input Validation and Filtering enabled Revision $Revision: 289434 $ Directive Local Value Master Value filter.default unsafe_raw unsafe_raw filter.default_flags no value no value ftp FTP support enabled gd GD Support enabled GD Version bundled (2.0.34 compatible) FreeType Support enabled FreeType Linkage with freetype FreeType Version 2.3.11 T1Lib Support enabled GIF Read Support enabled GIF Create Support enabled JPEG Support enabled libJPEG Version 7 PNG Support enabled libPNG Version 1.2.40 WBMP Support enabled XBM Support enabled JIS-mapped Japanese Font Support enabled Directive Local Value Master Value gd.jpeg_ignore_warning 0 0 gettext GetText Support enabled hash hash support enabled Hashing Engines md2 md4 md5 sha1 sha224 sha256 sha384 sha512 ripemd128 ripemd160 ripemd256 ripemd320 whirlpool tiger128,3 tiger160,3 tiger192,3 tiger128,4 tiger160,4 tiger192,4 snefru snefru256 gost adler32 crc32 crc32b salsa10 salsa20 haval128,3 haval160,3 haval192,3 haval224,3 haval256,3 haval128,4 haval160,4 haval192,4 haval224,4 haval256,4 haval128,5 haval160,5 haval192,5 haval224,5 haval256,5 iconv iconv support enabled iconv implementation "libiconv" iconv library version 1.13 Directive Local Value Master Value iconv.input_encoding ISO-8859-1 ISO-8859-1 
iconv.internal_encoding ISO-8859-1 ISO-8859-1 iconv.output_encoding ISO-8859-1 ISO-8859-1 imap IMAP c-Client Version 2007e SSL Support enabled json json support enabled json version 1.2.1 libxml libXML support active libXML Compiled Version 2.7.6 libXML Loaded Version 20706 libXML streams enabled mbstring Multibyte Support enabled Multibyte string engine libmbfl HTTP input encoding translation disabled mbstring extension makes use of "streamable kanji code filter and converter", which is distributed under the GNU Lesser General Public License version 2.1. Multibyte (japanese) regex support enabled Multibyte regex (oniguruma) version 4.7.1 Directive Local Value Master Value mbstring.detect_order no value no value mbstring.encoding_translation Off Off mbstring.func_overload 0 0 mbstring.http_input pass pass mbstring.http_output pass pass mbstring.http_output_conv_mimetypes ^(text/|application/xhtml\+xml) ^(text/|application/xhtml\+xml) mbstring.internal_encoding no value no value mbstring.language neutral neutral mbstring.strict_detection Off Off mbstring.substitute_character no value no value mcrypt mcrypt support enabled Version 2.5.8 Api No 20021217 Supported ciphers cast-128 gost rijndael-128 twofish arcfour cast-256 loki97 rijndael-192 saferplus wake blowfish-compat des rijndael-256 serpent xtea blowfish enigma rc2 tripledes Supported modes cbc cfb ctr ecb ncfb nofb ofb stream Directive Local Value Master Value mcrypt.algorithms_dir no value no value mcrypt.modes_dir no value no value mhash MHASH support Enabled MHASH API Version Emulated Support ming Ming SWF output library enabled Version 0.4.3 mysql MySQL Support enabled Active Persistent Links 0 Active Links 0 Client API version 5.1.41 Directive Local Value Master Value mysql.allow_local_infile On On mysql.allow_persistent On On mysql.connect_timeout 60 60 mysql.default_host no value no value mysql.default_password no value no value mysql.default_port 3306 3306 mysql.default_socket MySQL MySQL mysql.default_user no value no value mysql.max_links Unlimited Unlimited mysql.max_persistent Unlimited Unlimited mysql.trace_mode Off Off mysqli MysqlI Support enabled Client API library version 5.1.41 Active Persistent Links 0 Inactive Persistent Links 0 Active Links 0 Client API header version 5.1.41 MYSQLI_SOCKET MySQL Directive Local Value Master Value mysqli.allow_local_infile On On mysqli.allow_persistent On On mysqli.default_host no value no value mysqli.default_port 3306 3306 mysqli.default_pw no value no value mysqli.default_socket MySQL MySQL mysqli.default_user no value no value mysqli.max_links Unlimited Unlimited mysqli.max_persistent Unlimited Unlimited mysqli.reconnect Off Off mysqlnd mysqlnd enabled Version mysqlnd 5.0.5-dev - 081106 - $Revision: 289630 $ Command buffer size 4096 Read buffer size 32768 Read timeout 31536000 Collecting statistics Yes Collecting memory statistics No Client statistics bytes_sent 0 bytes_received 0 packets_sent 0 packets_received 0 protocol_overhead_in 0 protocol_overhead_out 0 bytes_received_ok_packet 0 bytes_received_eof_packet 0 bytes_received_rset_header_packet 0 bytes_received_rset_field_meta_packet 0 bytes_received_rset_row_packet 0 bytes_received_prepare_response_packet 0 bytes_received_change_user_packet 0 packets_sent_command 0 packets_received_ok 0 packets_received_eof 0 packets_received_rset_header 0 packets_received_rset_field_meta 0 packets_received_rset_row 0 packets_received_prepare_response 0 packets_received_change_user 0 result_set_queries 0 non_result_set_queries 0 no_index_used 
0 bad_index_used 0 slow_queries 0 buffered_sets 0 unbuffered_sets 0 ps_buffered_sets 0 ps_unbuffered_sets 0 flushed_normal_sets 0 flushed_ps_sets 0 ps_prepared_never_executed 0 ps_prepared_once_executed 0 rows_fetched_from_server_normal 0 rows_fetched_from_server_ps 0 rows_buffered_from_client_normal 0 rows_buffered_from_client_ps 0 rows_fetched_from_client_normal_buffered 0 rows_fetched_from_client_normal_unbuffered 0 rows_fetched_from_client_ps_buffered 0 rows_fetched_from_client_ps_unbuffered 0 rows_fetched_from_client_ps_cursor 0 rows_skipped_normal 0 rows_skipped_ps 0 copy_on_write_saved 0 copy_on_write_performed 0 command_buffer_too_small 0 connect_success 0 connect_failure 0 connection_reused 0 reconnect 0 pconnect_success 0 active_connections 0 active_persistent_connections 0 explicit_close 0 implicit_close 0 disconnect_close 0 in_middle_of_command_close 0 explicit_free_result 0 implicit_free_result 0 explicit_stmt_close 0 implicit_stmt_close 0 mem_emalloc_count 0 mem_emalloc_ammount 0 mem_ecalloc_count 0 mem_ecalloc_ammount 0 mem_erealloc_count 0 mem_erealloc_ammount 0 mem_efree_count 0 mem_malloc_count 0 mem_malloc_ammount 0 mem_calloc_count 0 mem_calloc_ammount 0 mem_realloc_count 0 mem_realloc_ammount 0 mem_free_count 0 proto_text_fetched_null 0 proto_text_fetched_bit 0 proto_text_fetched_tinyint 0 proto_text_fetched_short 0 proto_text_fetched_int24 0 proto_text_fetched_int 0 proto_text_fetched_bigint 0 proto_text_fetched_decimal 0 proto_text_fetched_float 0 proto_text_fetched_double 0 proto_text_fetched_date 0 proto_text_fetched_year 0 proto_text_fetched_time 0 proto_text_fetched_datetime 0 proto_text_fetched_timestamp 0 proto_text_fetched_string 0 proto_text_fetched_blob 0 proto_text_fetched_enum 0 proto_text_fetched_set 0 proto_text_fetched_geometry 0 proto_text_fetched_other 0 proto_binary_fetched_null 0 proto_binary_fetched_bit 0 proto_binary_fetched_tinyint 0 proto_binary_fetched_short 0 proto_binary_fetched_int24 0 proto_binary_fetched_int 0 proto_binary_fetched_bigint 0 proto_binary_fetched_decimal 0 proto_binary_fetched_float 0 proto_binary_fetched_double 0 proto_binary_fetched_date 0 proto_binary_fetched_year 0 proto_binary_fetched_time 0 proto_binary_fetched_datetime 0 proto_binary_fetched_timestamp 0 proto_binary_fetched_string 0 proto_binary_fetched_blob 0 proto_binary_fetched_enum 0 proto_binary_fetched_set 0 proto_binary_fetched_geometry 0 proto_binary_fetched_other 0 init_command_executed_count 0 init_command_failed_count 0 odbc ODBC Support enabled Active Persistent Links 0 Active Links 0 ODBC library Win32 Directive Local Value Master Value odbc.allow_persistent On On odbc.check_persistent On On odbc.default_cursortype Static cursor Static cursor odbc.default_db no value no value odbc.default_pw no value no value odbc.default_user no value no value odbc.defaultbinmode return as is return as is odbc.defaultlrl return up to 4096 bytes return up to 4096 bytes odbc.max_links Unlimited Unlimited odbc.max_persistent Unlimited Unlimited openssl OpenSSL support enabled OpenSSL Library Version OpenSSL 0.9.8l 5 Nov 2009 OpenSSL Header Version OpenSSL 0.9.8l 5 Nov 2009 pcre PCRE (Perl Compatible Regular Expressions) Support enabled PCRE Library Version 8.00 2009-10-19 Directive Local Value Master Value pcre.backtrack_limit 100000 100000 pcre.recursion_limit 100000 100000 pdf PDF Support enabled PDFlib GmbH Version 7.0.4p4 PECL Version 2.1.6 Revision $Revision: 277110 $ PDO PDO support enabled PDO drivers mysql, odbc, sqlite, sqlite2 pdo_mysql PDO Driver for MySQL enabled 
Client API version 5.1.41 PDO_ODBC PDO Driver for ODBC (Win32) enabled ODBC Connection Pooling Enabled, strict matching pdo_sqlite PDO Driver for SQLite 3.x enabled SQLite Library 3.6.20 Phar Phar: PHP Archive support enabled Phar EXT version 2.0.1 Phar API version 1.1.1 CVS revision $Revision: 286338 $ Phar-based phar archives enabled Tar-based phar archives enabled ZIP-based phar archives enabled gzip compression enabled bzip2 compression enabled Native OpenSSL support enabled Phar based on pear/PHP_Archive, original concept by Davey Shafik. Phar fully realized by Gregory Beaver and Marcus Boerger. Portions of tar implementation Copyright (c) 2003-2009 Tim Kientzle. Directive Local Value Master Value phar.cache_list no value no value phar.readonly On On phar.require_hash On On Reflection Reflection enabled Version $Revision: 287991 $ session Session Support enabled Registered save handlers files user sqlite Registered serializer handlers php php_binary wddx Directive Local Value Master Value session.auto_start Off Off session.bug_compat_42 On On session.bug_compat_warn On On session.cache_expire 180 180 session.cache_limiter nocache nocache session.cookie_domain no value no value session.cookie_httponly Off Off session.cookie_lifetime 0 0 session.cookie_path / / session.cookie_secure Off Off session.entropy_file no value no value session.entropy_length 0 0 session.gc_divisor 100 100 session.gc_maxlifetime 1440 1440 session.gc_probability 1 1 session.hash_bits_per_character 5 5 session.hash_function 0 0 session.name PHPSESSID PHPSESSID session.referer_check no value no value session.save_handler files files session.save_path C:\xampp\tmp C:\xampp\tmp session.serialize_handler php php session.use_cookies On On session.use_only_cookies Off Off session.use_trans_sid 0 0 SimpleXML Simplexml support enabled Revision $Revision: 281953 $ Schema support enabled soap Soap Client enabled Soap Server enabled Directive Local Value Master Value soap.wsdl_cache 1 1 soap.wsdl_cache_dir /tmp /tmp soap.wsdl_cache_enabled 1 1 soap.wsdl_cache_limit 5 5 soap.wsdl_cache_ttl 86400 86400 sockets Sockets Support enabled SPL SPL support enabled Interfaces Countable, OuterIterator, RecursiveIterator, SeekableIterator, SplObserver, SplSubject Classes AppendIterator, ArrayIterator, ArrayObject, BadFunctionCallException, BadMethodCallException, CachingIterator, DirectoryIterator, DomainException, EmptyIterator, FilesystemIterator, FilterIterator, GlobIterator, InfiniteIterator, InvalidArgumentException, IteratorIterator, LengthException, LimitIterator, LogicException, MultipleIterator, NoRewindIterator, OutOfBoundsException, OutOfRangeException, OverflowException, ParentIterator, RangeException, RecursiveArrayIterator, RecursiveCachingIterator, RecursiveDirectoryIterator, RecursiveFilterIterator, RecursiveIteratorIterator, RecursiveRegexIterator, RecursiveTreeIterator, RegexIterator, RuntimeException, SplDoublyLinkedList, SplFileInfo, SplFileObject, SplFixedArray, SplHeap, SplMinHeap, SplMaxHeap, SplObjectStorage, SplPriorityQueue, SplQueue, SplStack, SplTempFileObject, UnderflowException, UnexpectedValueException SQLite SQLite support enabled PECL Module version 2.0-dev $Id: sqlite.c 289598 2009-10-12 22:37:52Z pajoye $ SQLite Library 2.8.17 SQLite Encoding iso8859 Directive Local Value Master Value sqlite.assoc_case 0 0 sqlite3 SQLite3 support enabled SQLite3 module version 0.7-dev SQLite Library 3.6.20 Directive Local Value Master Value sqlite3.extension_dir no value no value standard Dynamic Library Support 
enabled Internal Sendmail Support for Windows enabled Directive Local Value Master Value assert.active 1 1 assert.bail 0 0 assert.callback no value no value assert.quiet_eval 0 0 assert.warning 1 1 auto_detect_line_endings 0 0 default_socket_timeout 60 60 safe_mode_allowed_env_vars PHP_ PHP_ safe_mode_protected_env_vars LD_LIBRARY_PATH LD_LIBRARY_PATH url_rewriter.tags a=href,area=href,frame=src,input=src,form=,fieldset= a=href,area=href,frame=src,input=src,form=,fieldset= user_agent no value no value tokenizer Tokenizer Support enabled wddx WDDX Support enabled WDDX Session Serializer enabled xml XML Support active XML Namespace Support active libxml2 Version 2.7.6 xmlreader XMLReader enabled xmlrpc core library version xmlrpc-epi v. 0.54 php extension version 0.51 author Dan Libby homepage http://xmlrpc-epi.sourceforge.net open sourced by Epinions.com xmlwriter XMLWriter enabled xsl XSL enabled libxslt Version 1.1.26 libxslt compiled against libxml Version 2.7.6 EXSLT enabled libexslt Version 1.1.26 zip Zip enabled Extension Version $Id: php_zip.c 276389 2009-02-24 23:55:14Z iliaa $ Zip version 1.9.1 Libzip version 0.9.0 zlib ZLib Support enabled Stream Wrapper support compress.zlib:// Stream Filter support zlib.inflate, zlib.deflate Compiled Version 1.2.3 Linked Version 1.2.3 Directive Local Value Master Value zlib.output_compression Off Off zlib.output_compression_level -1 -1 zlib.output_handler no value no value Additional Modules Module Name Environment Variable Value no value ::=::\ no value C:=C:\xampp ALLUSERSPROFILE C:\Documents and Settings\All Users APPDATA C:\Documents and Settings\Andrew\Application Data CHROME_RESTART Google Chrome|Whoa! Google Chrome has crashed. Restart now?|LEFT_TO_RIGHT CHROME_VERSION 5.0.342.8 CLASSPATH .;C:\Program Files\QuickTime\QTSystem\QTJava.zip CommonProgramFiles C:\Program Files\Common Files COMPUTERNAME ANDREW_LAPTOP ComSpec C:\WINDOWS\system32\cmd.exe FP_NO_HOST_CHECK NO HOMEDRIVE C: HOMEPATH \Documents and Settings\Andrew LOGONSERVER \\ANDREW_LAPTOP NUMBER_OF_PROCESSORS 2 OS Windows_NT PATH C:\Documents and Settings\Andrew\Local Settings\Application Data\Google\Chrome\Application;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;c:\Program Files\Microsoft SQL Server\100\Tools\Binn\;c:\Program Files\Microsoft SQL Server\100\DTS\Binn\;C:\Program Files\QuickTime\QTSystem\;C:\Program Files\Common Files\DivX Shared\;C:\Program Files\WiTopia.Net\bin PATHEXT .COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH PROCESSOR_ARCHITECTURE x86 PROCESSOR_IDENTIFIER x86 Family 6 Model 15 Stepping 10, GenuineIntel PROCESSOR_LEVEL 6 PROCESSOR_REVISION 0f0a ProgramFiles C:\Program Files PROMPT $P$G QTJAVA C:\Program Files\QuickTime\QTSystem\QTJava.zip SESSIONNAME Console sfxcmd "C:\Documents and Settings\Andrew\My Documents\Downloads\xampp-win32-1.7.3.exe" sfxname C:\Documents and Settings\Andrew\My Documents\Downloads\xampp-win32-1.7.3.exe SystemDrive C: SystemRoot C:\WINDOWS TEMP C:\DOCUME~1\Andrew\LOCALS~1\Temp TMP C:\DOCUME~1\Andrew\LOCALS~1\Temp USERDOMAIN ANDREW_LAPTOP USERNAME Andrew USERPROFILE C:\Documents and Settings\Andrew VS100COMNTOOLS C:\Program Files\Microsoft Visual Studio 10.0\Common7\Tools\ windir C:\WINDOWS AP_PARENT_PID 2216 PHP Variables Variable Value _SERVER["MIBDIRS"] C:/xampp/php/extras/mibs _SERVER["MYSQL_HOME"] C:\xampp\mysql\bin _SERVER["OPENSSL_CONF"] C:/xampp/apache/bin/openssl.cnf _SERVER["PHP_PEAR_SYSCONF_DIR"] C:\xampp\php _SERVER["PHPRC"] C:\xampp\php _SERVER["TMP"] C:\xampp\tmp _SERVER["HTTP_HOST"] localhost 
_SERVER["HTTP_CONNECTION"] keep-alive _SERVER["HTTP_USER_AGENT"] Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/533.2 (KHTML, like Gecko) Chrome/5.0.342.8 Safari/533.2 _SERVER["HTTP_CACHE_CONTROL"] max-age=0 _SERVER["HTTP_ACCEPT"] application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5 _SERVER["HTTP_ACCEPT_ENCODING"] gzip,deflate,sdch _SERVER["HTTP_ACCEPT_LANGUAGE"] en-US,en;q=0.8 _SERVER["HTTP_ACCEPT_CHARSET"] ISO-8859-1,utf-8;q=0.7,*;q=0.3 _SERVER["PATH"] C:\Documents and Settings\Andrew\Local Settings\Application Data\Google\Chrome\Application;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;c:\Program Files\Microsoft SQL Server\100\Tools\Binn\;c:\Program Files\Microsoft SQL Server\100\DTS\Binn\;C:\Program Files\QuickTime\QTSystem\;C:\Program Files\Common Files\DivX Shared\;C:\Program Files\WiTopia.Net\bin _SERVER["SystemRoot"] C:\WINDOWS _SERVER["COMSPEC"] C:\WINDOWS\system32\cmd.exe _SERVER["PATHEXT"] .COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH _SERVER["WINDIR"] C:\WINDOWS _SERVER["SERVER_SIGNATURE"] <address>Apache/2.2.14 (Win32) DAV/2 mod_ssl/2.2.14 OpenSSL/0.9.8l mod_autoindex_color PHP/5.3.1 mod_apreq2-20090110/2.7.1 mod_perl/2.0.4 Perl/v5.10.1 Server at localhost Port 80</address> _SERVER["SERVER_SOFTWARE"] Apache/2.2.14 (Win32) DAV/2 mod_ssl/2.2.14 OpenSSL/0.9.8l mod_autoindex_color PHP/5.3.1 mod_apreq2-20090110/2.7.1 mod_perl/2.0.4 Perl/v5.10.1 _SERVER["SERVER_NAME"] localhost _SERVER["SERVER_ADDR"] 127.0.0.1 _SERVER["SERVER_PORT"] 80 _SERVER["REMOTE_ADDR"] 127.0.0.1 _SERVER["DOCUMENT_ROOT"] C:/xampp/htdocs _SERVER["SERVER_ADMIN"] postmaster@localhost _SERVER["SCRIPT_FILENAME"] C:/xampp/htdocs/test.php _SERVER["REMOTE_PORT"] 3275 _SERVER["GATEWAY_INTERFACE"] CGI/1.1 _SERVER["SERVER_PROTOCOL"] HTTP/1.1 _SERVER["REQUEST_METHOD"] GET _SERVER["QUERY_STRING"] no value _SERVER["REQUEST_URI"] /test.php _SERVER["SCRIPT_NAME"] /test.php _SERVER["PHP_SELF"] /test.php _SERVER["REQUEST_TIME"] 1270600868 _SERVER["argv"] Array ( ) _SERVER["argc"] 0 PHP License This program is free software; you can redistribute it and/or modify it under the terms of the PHP License as published by the PHP Group and included in the distribution in the file: LICENSE This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. If you did not receive a copy of the PHP license, or have any questions about PHP licensing, please contact [email protected].

    Read the article

  • Multitask Like a Pro with AquaSnap

    - by Matthew Guay
    Are you tired of shuffling back and forth between windows?  Here’s a handy app that can help you keep all of your windows organized and accessible. AquaSnap is a great free utility that helps you use multiple windows at the same time easily and efficiently.  One of Windows 7’s greatest new features is Aero Snap, which lets you easily view windows side by side by simply dragging windows to the side of your screen.  After using Windows 7 for the past year, Aero Snap is one of the features we really miss when using older versions of Windows. With AquaSnap, you now have all of the features of Aero Snap and more in Windows 2000, XP, Vista, and of course Windows 7.  Not only does it give you Aero Snap features, but AquaSnap also gives you more control over your windows to make you more productive. Getting Started AquaSnap is a free download for Windows 2000, XP, Vista, and 7.  Download the small installer (link below) and install it with the default settings. AquaSnap automatically runs as soon as it is installed, and you will notice a new icon in your system tray. Now you can go ahead and put it to use.  Drag a window to any edge or corner of your desktop, and you will see an icon showing what part of the screen the window will cover. Dragging it to the side of the screen expands the window to fill the right half of the screen, just like the default Aero Snap in Windows 7.  You can drag the window away to restore it to its former size. AquaSnap works on any corner of the screen too, so you can have 4 windows side-by-side.  We already have 3 windows snapped to the corners, and notice that we’re dragging a fourth window to the bottom right corner. You can also snap windows to the bottom and top of the screen.  Here we have Word snapped to the bottom half of the screen, and we’re dragging Chrome to the top. You can even snap internal windows in Multiple Document Interface (MDI) programs such as Excel.  Here we are snapping a workbook in Excel to the left to view 2 workbooks side-by-side.  Additionally, AquaSnap lets you keep any window always on top.  Simply shake any window, and it will turn semi-transparent and stay on top of all other windows.  Notice the transparent calculator here on top of Excel. All of AquaSnap’s features work great in Windows 2000, XP, and Vista too.  Here we are snapping IE6 to the left of the screen in XP. Here are 3 windows snapped to the sides in XP.  You can mix the snap modes, and have, for instance, two windows on the right side and one window on the left.  This is a great way to maximize productivity if you need more space in one of the windows. Even AquaShake works to keep a window transparent and on top in XP. Settings AquaSnap has a detailed settings dialog where you can tweak it to work exactly like you want.  Simply right-click on its icon in the taskbar, and select Settings. From the first screen, you can choose if you want AquaSnap to start with Windows, and if you want it to show an icon in the system tray.  If you turn off the system tray icon, you can access the AquaSnap settings from Start > All Programs > AquaSnap > Configuration (or simply search for Configuration in Vista or Windows 7). The second tab in settings lets you choose what you want each snapping region to do.  You can also choose two other presets, including AeroSnap (which works just like the default Aero Snap in Windows 7) and AquaSnap simple (which only snaps at the edges of the screen, not the corners). 
    The third tab lets you increase or decrease the opacity of pinned windows when using AquaShake, and also lets you increase or decrease the shaking sensitivity.  Additionally, if you prefer the standard AeroShake functionality, which minimizes all other open windows when you shake a window, you can choose that too. The fourth tab lets you activate an optional feature, AquaGlass.  If you activate this, it will make windows turn transparent when you drag them across the screen.  Finally, the last tab lets you change the color and opacity of the preview rectangle, or simply turn it off. Or, if you want to temporarily turn AquaSnap off, simply right-click on its icon and select Off.  In Windows 7, turning off AquaSnap will restore your standard Windows Aero Snap functionality, and in other versions of Windows it will stop letting you snap windows at all.  You can then repeat the steps and select On when you want to use AquaSnap again. Conclusion AquaSnap is a handy tool to make you more productive at your computer.  With a wide variety of useful features, there’s something here for everyone.  Download AquaSnap

    Read the article

  • LLBLGen Pro v3.5 has been released!

    - by FransBouma
    Last weekend we released LLBLGen Pro v3.5! Below is the list of what's new in this release. Of course, not everything is on this list, like the large amount of work we put in refactoring the runtime framework. The refactoring was necessary because our framework has two paradigms which were added to the framework at different times, and from a design perspective in the wrong order (the paradigm we added first, SelfServicing, should have been built on top of Adapter, the other paradigm, which was added more than a year after the first released version). The refactoring made sure the framework re-uses more code across the two paradigms (they already shared a lot of code) and is better prepared for the future. We're not done yet, but refactoring a massive framework like ours without breaking interfaces and existing applications is ... a bit of a challenge ;) To celebrate the release of v3.5, we give every customer a 30% discount! Use the coupon code NR1ORM with your order :) The full list of what's new: Designer Rule based .NET Attribute definitions. It's now possible to specify a rule using fine-grained expressions with an attribute definition to define which elements of a given type will receive the attribute definition. Rules can be assigned to attribute definitions on the project level, to make it even easier to define attribute definitions in bulk for many elements in the project. More information... Revamped Project Settings dialog. Multiple project related properties and settings dialogs have been merged into a single dialog called Project Settings, which makes it easier to configure the various settings related to project elements. It also makes it easier to find features previously not used by many (e.g. type conversions). More information... Home tab with Quick Start Guides. To make new users feel right at home, we added a home tab with quick start guides which guide you through four main use cases of the designer. System Type Converters. Many common conversions have been implemented by default in system type converters so users don't have to develop their own type converters anymore for these type conversions. Bulk Element Setting Manipulator. To change setting values for multiple project elements, it was a little cumbersome to do that without a lot of clicking and opening various editors. This dialog makes changing settings for multiple elements very easy. EDMX Importer. It's now possible to import entity model data information from an existing Entity Framework EDMX file. Other changes and fixes See the online documentation for the full list of changes and fixes. LLBLGen Pro Runtime Framework WCF Data Services (OData) support has been added. It's now possible to use your LLBLGen Pro runtime framework powered domain layer in a WCF Data Services application using the VS.NET tools for WCF Data Services. WCF Data Services is a Microsoft technology for .NET 4 to expose your domain model using OData. More information... New query specification and execution API: QuerySpec. QuerySpec is our new query specification and execution API as an alternative to Linq and our more low-level API. It's built, like our Linq provider, on top of our lower-level API. More information... SQL Server 2012 support. The SQL Server DQE allows paging using the new SQL Server 2012 style. More information... System Type converters. For a common set of types the LLBLGen Pro runtime framework contains built-in type conversions so you don't need to write your own type converters anymore. 
Public/NonPublic property support. It's now possible to mark a field / navigator as non-public which is reflected in the runtime framework as an internal/friend property instead of a public property. This way you can hide properties from the public interface of a generated class and still access it through code added to the generated code base. FULL JOIN support. It's now possible to perform FULL JOIN joins using the native query api and QuerySpec. It's left to the developer to check whether the used target database supports FULL (OUTER) JOINs. Using a FULL JOIN with entity fetches is not recommended, and should only be used when both participants in the join aren't the target of the fetch. Dependency Injection Tracing. It's now possible to enable tracing on dependency injection. Enable tracing at level '4' on the traceswitch 'ORMGeneral'. This will emit trace information about which instance of which type got an instance of type T injected into property P. Entity Instances in projections in Linq. It's now possible to return an entity instance in a custom Linq projection. It's now also possible to pass this instance to a method inside the query projection. Inheritance fully supported in this construct. Entity Framework support The Entity Framework has been updated in the recent year with code-first support and a new simpler context api: DbContext (with DbSet). The amount of code to generate is smaller and the context simpler. LLBLGen Pro v3.5 comes with support for DbContext and DbSet and generates code which utilizes these new classes. NHibernate support NHibernate v3.2+ built-in proxy factory factory support. By default the built-in ProxyFactoryFactory is selected. FluentNHibernate Session Manager uses 1.2 syntax. Fluent NHibernate mappings generate a SessionManager which uses the v1.2 syntax for the ProxyFactoryFactory location Optionally emit schema / catalog name in mappings Two settings have been added which allow the user to control whether the catalog name and/or schema name as known in the project in the designer is emitted into the mappings.

    Read the article

  • Persistent Storage in Windows Phone 7

    - by Richard Jones
    Mark Arteaga – fellow MVP – just helped me with this, so I thought I’d share. Windows Phone 7 Silverlight supports persistent storage, which is a great way of saving settings between runs of your application. However, I couldn’t get it to work. To create a persistent storage instance, you do this -    private IsolatedStorageSettings userSettings = IsolatedStorageSettings.ApplicationSettings; To retrieve settings, do the following - string xx = userSettings["SomeValue"].ToString();   However, the bit that got me was how to save settings; you have to do this - userSettings.Add("Hello", DateTime.Now.ToShortTimeString());           userSettings.Save(); // <- This is the bit that got me.   Hope this helps others.

    Read the article

  • Ubuntu Oneiric + Awesome WM

    - by janjust
    I upgraded to Oneiric Ocelot, running the Awesome WM. Everything works more or less fine, but one thing I've noticed is that my menu fonts and menu symbols are now larger than I'd like them to be. I used to set them in the font settings, but those settings seem to be gone now (for one, I don't even know where they live anymore; I tried gnome-tweak-tool). Surely I'm missing something. My prime example is the program evince, whose symbols are ginormous. Any hints on how to tweak it?
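
    Outside a full GNOME session the oversized GTK fonts usually come down to two knobs - the Xft DPI and the GTK interface font - so a rough terminal sketch like the one below may be worth a try. It assumes gnome-settings-daemon may or may not be running inside the awesome session, and the DPI value and font name are examples only, not recommendations:

        # Pin the Xft DPI so toolkits stop guessing it (96 is just an example value).
        echo 'Xft.dpi: 96' >> ~/.Xresources
        xrdb -merge ~/.Xresources

        # If gnome-settings-daemon is running inside the awesome session, the GTK
        # interface font can be set directly via the GNOME 3 schema (example font).
        gsettings set org.gnome.desktop.interface font-name 'Ubuntu 10'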

    Read the article

  • ResponseStatusLine protocol violation

    - by Tom Hines
    I parse/scrape a few web page every now and then and recently ran across an error that stated: "The server committed a protocol violation. Section=ResponseStatusLine".   After a few web searches, I found a couple of suggestions – one of which said the problem could be fixed by changing the HttpWebRequest ProtocolVersion to 1.0 with the command: 1: HttpWebRequest req = (HttpWebRequest)HttpWebRequest.Create(strURI); 2: req.ProtocolVersion = HttpVersion.Version10;   …but that did not work in my particular case.   What DID work was the next suggestion I found that suggested the use of the setting: “useUnsafeHeaderParsing” either in the app.config file or programmatically. If added to the app.config, it would be: 1: <!-- after the applicationSettings --> 2: <system.net> 3: <settings> 4: <httpWebRequest useUnsafeHeaderParsing ="true"/> 5: </settings> 6: </system.net>   If done programmatically, it would look like this: C++: 1: // UUHP_CPP.h 2: #pragma once 3: using namespace System; 4: using namespace System::Reflection; 5:   6: namespace UUHP_CPP 7: { 8: public ref class CUUHP_CPP 9: { 10: public: 11: static bool UseUnsafeHeaderParsing(String^% strError) 12: { 13: Assembly^ assembly = Assembly::GetAssembly(System::Net::Configuration::SettingsSection::typeid); //__typeof 14: if (nullptr==assembly) 15: { 16: strError = "Could not access Assembly"; 17: return false; 18: } 19:   20: Type^ type = assembly->GetType("System.Net.Configuration.SettingsSectionInternal"); 21: if (nullptr==type) 22: { 23: strError = "Could not access internal settings"; 24: return false; 25: } 26:   27: Object^ obj = type->InvokeMember("Section", 28: BindingFlags::Static | BindingFlags::GetProperty | BindingFlags::NonPublic, 29: nullptr, nullptr, gcnew array<Object^,1>(0)); 30:   31: if(nullptr == obj) 32: { 33: strError = "Could not invoke Section member"; 34: return false; 35: } 36:   37: FieldInfo^ fi = type->GetField("useUnsafeHeaderParsing", BindingFlags::NonPublic | BindingFlags::Instance); 38: if(nullptr == fi) 39: { 40: strError = "Could not access useUnsafeHeaderParsing field"; 41: return false; 42: } 43:   44: if (!(bool)fi->GetValue(obj)) 45: { 46: fi->SetValue(obj, true); 47: } 48:   49: return true; 50: } 51: }; 52: } C# (CSharp): 1: using System; 2: using System.Reflection; 3:   4: namespace UUHP_CS 5: { 6: public class CUUHP_CS 7: { 8: public static bool UseUnsafeHeaderParsing(ref string strError) 9: { 10: Assembly assembly = Assembly.GetAssembly(typeof(System.Net.Configuration.SettingsSection)); 11: if (null == assembly) 12: { 13: strError = "Could not access Assembly"; 14: return false; 15: } 16:   17: Type type = assembly.GetType("System.Net.Configuration.SettingsSectionInternal"); 18: if (null == type) 19: { 20: strError = "Could not access internal settings"; 21: return false; 22: } 23:   24: object obj = type.InvokeMember("Section", 25: BindingFlags.Static | BindingFlags.GetProperty | BindingFlags.NonPublic, 26: null, null, new object[] { }); 27:   28: if (null == obj) 29: { 30: strError = "Could not invoke Section member"; 31: return false; 32: } 33:   34: // If it's not already set, set it. 
35: FieldInfo fi = type.GetField("useUnsafeHeaderParsing", BindingFlags.NonPublic | BindingFlags.Instance); 36: if (null == fi) 37: { 38: strError = "Could not access useUnsafeHeaderParsing field"; 39: return false; 40: } 41:   42: if (!Convert.ToBoolean(fi.GetValue(obj))) 43: { 44: fi.SetValue(obj, true); 45: } 46:   47: return true; 48: } 49: } 50: }   F# (FSharp): 1: namespace UUHP_FS 2: open System 3: open System.Reflection 4: module CUUHP_FS = 5: let UseUnsafeHeaderParsing(strError : byref<string>) : bool = 6: // 7: let assembly : Assembly = Assembly.GetAssembly(typeof<System.Net.Configuration.SettingsSection>) 8: if (null = assembly) then 9: strError <- "Could not access Assembly" 10: false 11: else 12: 13: let myType : Type = assembly.GetType("System.Net.Configuration.SettingsSectionInternal") 14: if (null = myType) then 15: strError <- "Could not access internal settings" 16: false 17: else 18: 19: let obj : Object = myType.InvokeMember("Section", BindingFlags.Static ||| BindingFlags.GetProperty ||| BindingFlags.NonPublic, null, null, Array.zeroCreate 0) 20: if (null = obj) then 21: strError <- "Could not invoke Section member" 22: false 23: else 24: 25: // If it's not already set, set it. 26: let fi : FieldInfo = myType.GetField("useUnsafeHeaderParsing", BindingFlags.NonPublic ||| BindingFlags.Instance) 27: if(null = fi) then 28: strError <- "Could not access useUnsafeHeaderParsing field" 29: false 30: else 31: 32: if (not(Convert.ToBoolean(fi.GetValue(obj)))) then 33: fi.SetValue(obj, true) 34: 35: // Now return true 36: true VB (Visual Basic): 1: Option Explicit On 2: Option Strict On 3: Imports System 4: Imports System.Reflection 5:   6: Public Class CUUHP_VB 7: Public Shared Function UseUnsafeHeaderParsing(ByRef strError As String) As Boolean 8:   9: Dim assembly As [Assembly] 10: assembly = [assembly].GetAssembly(GetType(System.Net.Configuration.SettingsSection)) 11:   12: If (assembly Is Nothing) Then 13: strError = "Could not access Assembly" 14: Return False 15: End If 16:   17: Dim type As Type 18: type = [assembly].GetType("System.Net.Configuration.SettingsSectionInternal") 19: If (type Is Nothing) Then 20: strError = "Could not access internal settings" 21: Return False 22: End If 23:   24: Dim obj As Object 25: obj = [type].InvokeMember("Section", _ 26: BindingFlags.Static Or BindingFlags.GetProperty Or BindingFlags.NonPublic, _ 27: Nothing, Nothing, New [Object]() {}) 28:   29: If (obj Is Nothing) Then 30: strError = "Could not invoke Section member" 31: Return False 32: End If 33:   34: ' If it's not already set, set it. 35: Dim fi As FieldInfo 36: fi = [type].GetField("useUnsafeHeaderParsing", BindingFlags.NonPublic Or BindingFlags.Instance) 37: If (fi Is Nothing) Then 38: strError = "Could not access useUnsafeHeaderParsing field" 39: Return False 40: End If 41:   42: If (Not Convert.ToBoolean(fi.GetValue(obj))) Then 43: fi.SetValue(obj, True) 44: End If 45:   46: Return True 47: End Function 48: End Class   Technorati Tags: C++,CPP,VB,Visual Basic,F#,FSharp,C#,CSharp,ResponseStatusLine,protocol violation

    Read the article

  • Anti-aliasing changed after update (or something...)

    - by Mussnoon
    The anti-aliasing of my system (of GTK?) has gone weird after I did one of two things: a system update, or installing GIMP 2.7 beta. See the before/after screenshots, plus the current rendering comparison between Chromium, Firefox and Opera (in that order). Does anyone know how I can get the old anti-aliasing back? As far as I can tell, I never did anything special to achieve it before; it has always been on the default settings since I installed Lucid a few months ago. Update: I have tried different settings (even though I knew they were already at the "best" settings) in Appearance > Fonts > Details but, as expected, any change there only makes things worse.
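
    On Lucid the per-user font-rendering choices live in GConf, so they can also be inspected and reset from a terminal - a sketch only; the key names are the usual GNOME 2 ones, and "rgba"/"slight" are just example values:

        # Show the current anti-aliasing and hinting settings.
        gconftool-2 --get /desktop/gnome/font_rendering/antialiasing
        gconftool-2 --get /desktop/gnome/font_rendering/hinting

        # Reset them to typical LCD defaults (antialiasing: none/grayscale/rgba,
        # hinting: none/slight/medium/full).
        gconftool-2 --type string --set /desktop/gnome/font_rendering/antialiasing rgba
        gconftool-2 --type string --set /desktop/gnome/font_rendering/hinting slight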

    Read the article

  • How do I change the language via a terminal?

    - by McGee
    Using System Settings I changed my language to Arabic and deleted English from the settings. Then the computer lagged and logged me out - now I can't log back in because the login screen is in Arabic. So is there a way to set my language back to the default via the terminal, reset the login language, or log in via the terminal, which is still in English? I only have access to the guest session and a terminal. I changed the password to something that could be typed in Arabic (http://www.psychocats.net/ubuntu/resetpassword), then logged in and used System Settings to restore the default.
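
    For the terminal route, the system-wide language lives in /etc/default/locale, which update-locale rewrites - a minimal sketch, assuming you can log in at a text console (Ctrl+Alt+F1) and that an English locale is already generated; en_US.UTF-8 is only an example:

        # Check which English locales are available on the machine.
        locale -a | grep -i '^en_'

        # Reset the system-wide default language (this rewrites /etc/default/locale).
        sudo update-locale LANG=en_US.UTF-8 LANGUAGE=en_US:en

        # Reboot (or restart the display manager) so the login screen picks it up.
        sudo reboot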

    Read the article

  • chromium-browser --proxy-server debugging

    - by user3678068
    Many places online have pointed out that configuring the Chromium proxy via the command line can be achieved with the following: chromium-browser --proxy-server=[username]:[password]@[host]:[port] but I got this result on every request. Here's the output in the command line right after executing the previous command (the messages do not appear to be relevant, and there is no new command-line output when I try to visit a page): libGL error: failed to authenticate magic 30 libGL error: failed to load driver: vboxvideo ATTENTION: default value of option force_s3tc_enable overridden by environment. [29551:29551:0606/160459:ERROR:sandbox_linux.cc(268)] InitializeSandbox() called with multiple threads in process gpu-process I have double-checked that the proxy credentials work with the FoxyProxy Chrome plugin. What else can I try to figure this out? [Edit] Going to chrome://net-internals/#proxy and reading "Effective proxy settings": if I run chromium-browser with no flags, I get "Use DIRECT connections. Source: GSETTINGS"; if I run chromium-browser --proxy-server=[host]:[port], I get a message box requesting a login, and "Effective proxy settings" shows "Proxy server: [host]:[port]"; if I run chromium-browser --proxy-server=[user]:[pass]@[host]:[port], "Effective proxy settings" shows "Use DIRECT connections".
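
    The last observation above suggests the [user]:[pass]@ form is not being parsed as a proxy at all. One way to get more detail is Chromium's own logging - a sketch; --enable-logging and --v are standard Chromium switches, but it assumes the log ends up in the default ~/.config/chromium profile directory:

        # Run with the proxy flag plus verbose logging enabled.
        chromium-browser --proxy-server=[host]:[port] --enable-logging --v=1

        # After reproducing the failure, look for proxy-related lines in the log.
        grep -i proxy ~/.config/chromium/chrome_debug.log | tail -n 20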

    Read the article

  • Flash Webcam non responsive Ubuntu 11.10

    - by powerbuoy
    I've got the same problem as this gentleman: https://answers.launchpad.net/ubuntu/+source/flashplugin-nonfree/+question/176541 - the webcam settings/access do not work at all and are completely unresponsive in Ubuntu 11.10. I've tried webcam access in Facebook, Google+, my own code, plus a number of tutorials/demos, and none of them work. What happens is that the settings dialogue is completely unresponsive: clicking tabs or buttons does nothing. In the question linked above, a suggested answer is to run Unity 2D. Unfortunately this does not work for me (the exact same thing happens). I've also tried GNOME 3, which also does not work. Note that it is only the webcam settings that don't work; YouTube videos and annoying banners work just fine. Does anyone know of a workaround for this (other than going back to 11.04), or whether they've fixed this in 12.04? Also, are any of you experiencing the same thing?

    Read the article

  • Jupiter seems to break multimonitor setup

    - by Philippe
    If Jupiter is in my startup applications, my multi-monitor setup fails. Instead of two separate desktops, the monitors are mirrored, one over the other, with the smaller notebook desktop inside the larger screen. After each new log-in I have to reset the display settings in the "System Settings..." dialogue to two non-mirrored screens; the changes don't seem to be saved. Given this entry from xsession-errors: /usr/lib/jupiter/scripts/resolutions: line 42: /var/jupiter/available_resolutions_LVDS: Permission denied I assumed that it's coming from Jupiter. In fact, if Jupiter is removed as a startup application, the desktop settings are fine. Any idea how to solve this issue?
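
    A speculative check based on the "Permission denied" line above, worth ruling out before digging further - the file path is taken verbatim from that error, and loosening the permissions is only a test, not a proper fix:

        # See who owns the file Jupiter fails to write.
        ls -ld /var/jupiter
        ls -l /var/jupiter/available_resolutions_LVDS

        # If it is only writable by root, make it writable to test whether the
        # permission error is what breaks the saved display settings.
        sudo chmod a+rw /var/jupiter/available_resolutions_LVDS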

    Read the article

  • Using Transaction Logging to Recover Post-Archived Essbase data

    - by Keith Rosenthal
    Data recovery is typically performed by restoring data from an archive.  Data added or removed since the last archive took place can also be recovered by enabling transaction logging in Essbase.  Transaction logging works by writing transactions to a log store.  The information in the log store can then be recovered by replaying the log store entries in sequence since the last archive took place.  The following information is recorded within a transaction log entry: Sequence ID Username Start Time End Time Request Type A request type can be one of the following categories: Calculations, including the default calculation as well as both server and client side calculations Data loads, including data imports as well as data loaded using a load rule Data clears as well as outline resets Locking and sending data from SmartView and the Spreadsheet Add-In.  Changes from Planning web forms are also tracked since a lock and send operation occurs during this process. You can use the Display Transactions command in the EAS console or the query database MAXL command to view the transaction log entries. Enabling Transaction Logging Transaction logging can be enabled at the Essbase server, application or database level by adding the TRANSACTIONLOGLOCATION essbase.cfg setting.  The following is the TRANSACTIONLOGLOCATION syntax: TRANSACTIONLOGLOCATION [appname [dbname]] LOGLOCATION NATIVE ENABLE | DISABLE Note that you can have multiple TRANSACTIONLOGLOCATION entries in the essbase.cfg file.  For example: TRANSACTIONLOGLOCATION Hyperion/trlog NATIVE ENABLE TRANSACTIONLOGLOCATION Sample Hyperion/trlog NATIVE DISABLE The first statement will enable transaction logging for all Essbase applications, and the second statement will disable transaction logging for the Sample application.  As a result, transaction logging will be enabled for all applications except the Sample application. A location on a physical disk other than the disk where ARBORPATH or the disk files reside is recommended to optimize overall Essbase performance. Configuring Transaction Log Replay Although transaction log entries are stored based on the LOGLOCATION parameter of the TRANSACTIONLOGLOCATION essbase.cfg setting, copies of data load and rules files are stored in the ARBORPATH/app/appname/dbname/Replay directory to optimize the performance of replaying logged transactions.  The default is to archive client data loads, but this configuration setting can be used to archive server data loads (including SQL server data loads) or both client and server data loads. To change the type of data to be archived, add the TRANSACTIONLOGDATALOADARCHIVE configuration setting to the essbase.cfg file.  Note that you can have multiple TRANSACTIONLOGDATALOADARCHIVE entries in the essbase.cfg file to adjust settings for individual applications and databases. Replaying the Transaction Log and Transaction Log Security Considerations To replay the transactions, use either the Replay Transactions command in the EAS console or the alter database MAXL command using the replay transactions grammar.  Transactions can be replayed either after a specified log time or using a range of transaction sequence IDs. The default when replaying transactions is to use the security settings of the user who originally performed the transaction.  However, if that user no longer exists or that user's username was changed, the replay operation will fail. 
Instead of using the default security setting, add the REPLAYSECURITYOPTION essbase.cfg setting to use the security settings of the administrator who performs the replay operation.  REPLAYSECURITYOPTION 2 will explicitly use the security settings of the administrator performing the replay operation.  REPLAYSECURITYOPTION 3 will use the administrator security settings if the original user’s security settings cannot be used. Removing Transaction Logs and Archived Replay Data Load and Rules Files Transaction logs and archived replay data load and rules files are not automatically removed and are only removed manually.  Since these files can consume a considerable amount of space, the files should be removed on a periodic basis. The transaction logs should be removed one database at a time instead of all databases simultaneously.  The data load and rules files associated with the replayed transactions should be removed in chronological order from earliest to latest.  In addition, do not remove any data load and rules files with a timestamp later than the timestamp of the most recent archive file. Partitioned Database Considerations For partitioned databases, partition commands such as synchronization commands cannot be replayed.  When recovering data, the partition changes must be replayed manually and logged transactions must be replayed in the correct chronological order. If the partitioned database includes any @XREF commands in the calc script, the logged transactions must be selectively replayed in the correct chronological order between the source and target databases. References For additional information, please see the Oracle EPM System Backup and Recovery Guide.  For EPM 11.1.2.2, the link is http://docs.oracle.com/cd/E17236_01/epm.1112/epm_backup_recovery_1112200.pdf

    Read the article

  • Global menu not showing in Ubuntu 12.10 with Unity

    - by William
    I am on Ubuntu 12.10 with the Unity shell. I am not quite sure what happened, but today I noticed that the menu bar (containing 'File, Edit, etc.') is not appearing on the panel as it used to (i.e. the global application menu), but on the application window under the title bar. The only application that still uses the global menu is Chromium. Any ideas what settings may have changed? Also, when I right-click on the desktop and select "Change Desktop Background", System Settings opens instead of the "Appearance" settings. Apparently, here is a related question, with no answer: Global menu and HUD suddenly broken. EDIT: I noticed this behavior on three of my computers after an update (I do not remember exactly which packages were updated). Now when I try to run apt-get upgrade, I get: The following packages have been kept back: gnome-control-center gnome-control-center-data I guess something is going on with updates. Any info would be appreciated.
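
    Not a fix for the menu itself, but to clear the held-back control-center packages mentioned above, something along these lines usually works (a sketch; the package names are taken from the apt message):

        # See what a full upgrade would do with the held-back packages (simulation only).
        apt-get -s dist-upgrade | grep -i gnome-control-center

        # Installing the kept-back packages explicitly lets apt pull in their new dependencies.
        sudo apt-get install gnome-control-center gnome-control-center-data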

    Read the article

  • LibreOffice Writer Error due to a lock file

    - by Brad
    Every time I try to open a Word document I get this error from LibreOffice 3.5: "Either another instance of LibreOffice is accessing your personal settings or your personal settings are locked. Simultaneous access can lead to inconsistencies in your personal settings. Before continuing, you should make sure user "closes LibreOffice on host". Do you want to continue?" I've been onto different forums looking for a remedy, but nothing yet. I recently did a fresh install of 12.04 (64-bit) and have had some issues with different programs; I've been able to work through most of them, but on this one I'm lost.
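
    A sketch of the usual stale-lock check, assuming the profile lives under ~/.config/libreoffice (the normal location on 12.04) - verify what the find command reports before deleting anything:

        # Make sure nothing from LibreOffice is still running.
        pgrep -l soffice

        # Look for a stale profile lock file.
        find ~/.config/libreoffice -maxdepth 2 -name '.lock'

        # Remove it only while LibreOffice is completely closed.
        find ~/.config/libreoffice -maxdepth 2 -name '.lock' -delete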

    Read the article

  • Wobble effects not working on Ubuntu 12.04

    - by Sunny Deere
    Just a couple of days ago I installed Ubuntu 12.04 with all the necessary upgrades. I also installed the Compiz Settings Manager and loved the animations and wobble effects of windows. But after restarting today, I started receiving error messages and the wobble effects disappeared. Also, other settings I created in my last login, like small sidebar icons, disappeared too, and I can't even find the option in "All Settings > Appearance" to resize the icons. I love Ubuntu 12.04, especially the animation and wobble effects, but I'm unhappy with these errors. Thanks in advance.
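
    A sketch of how the wobbly effect is usually brought back on 12.04 - the package names are assumptions based on the standard Ubuntu repositories (on some releases the plugin lives in compiz-plugins instead of compiz-plugins-extra):

        # Reinstall CCSM and the extra Compiz plugins that provide Wobbly Windows.
        sudo apt-get install compizconfig-settings-manager compiz-plugins-extra

        # Then re-enable "Wobbly Windows" under Effects in CCSM.
        ccsm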

    Read the article

  • Why is the Touchpad tab missing?

    - by U47
    So my Dell Vostro 3360 came out of the box with an Alps touchpad which was detected as PS/2. Through these steps, I was able to get as far as it appearing to be recognized as a touchpad. However, I do not get the Touchpad tab in the Mouse/Touchpad settings. Two-finger scrolling appears to work, and Fn-F3 does disable the touchpad, but the settings are so brutal that it is almost impossible to work with - very difficult to scroll at all accurately. I'm hoping there are some settings that can be tweaked to help with this. Any ideas? My xinput listing is as follows: ⎡ Virtual core pointer id=2 [master pointer (3)] ⎜ ↳ Virtual core XTEST pointer id=4 [slave pointer (2)] ⎜ ↳ GlidePoint Virtual Touchpad id=12 [slave pointer (2)] Thanks,
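
    To see what the driver actually exposes for tuning, the sketch below lists the input properties - the device name is taken from the xinput output above, whether synclient works depends on the synaptics driver having bound the pad, and the values at the end are examples only:

        # Properties the X input driver exposes for this device.
        xinput list-props "GlidePoint Virtual Touchpad"

        # If the synaptics driver is handling it, its tunables (sensitivity,
        # scrolling, palm detection, ...) can be listed and changed on the fly.
        synclient -l
        synclient FingerLow=10 FingerHigh=20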

    Read the article

  • Change hotkeys/shortcut keys for volume control

    - by Daniel le Rouge
    I need to change the hotkeys for volume control, since I suffer from a bug which prevents me from using the standard function hotkeys of my notebook. Since I updated to Oneiric, I am not able to change the settings for the hotkeys. It is possible neither in System Settings nor in gconf-editor. Current buggy configuration: Volume mute: Fn + F3, Volume down: Fn + F4, Volume up: Fn + F5. Desired configuration: Volume mute: Ctrl + F3, Volume down: Ctrl + F4, Volume up: Ctrl + F5. If you need further information, I will be happy to provide it. I tried to override the standard settings by creating new shortcuts in the “Custom shortcuts” category, but even this attempt was unsuccessful. Is there a way to access this menu as root?
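
    When the keyboard-shortcuts GUI refuses the binding, the underlying GSettings keys can sometimes be set directly - a sketch assuming the GNOME 3 media-keys schema that Oneiric ships; check the key names on your machine with the first command before changing anything:

        # Confirm the schema and current volume bindings exist.
        gsettings list-recursively org.gnome.settings-daemon.plugins.media-keys | grep -i volume

        # Rebind the volume keys to Ctrl+F3/F4/F5.
        gsettings set org.gnome.settings-daemon.plugins.media-keys volume-mute '<Control>F3'
        gsettings set org.gnome.settings-daemon.plugins.media-keys volume-down '<Control>F4'
        gsettings set org.gnome.settings-daemon.plugins.media-keys volume-up '<Control>F5'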

    Read the article

  • No 'Hardware' tab in audio and no profiles

    - by Gene
    If I run the 12.x Ubuntu (latest as of May 2012) from the CD, I get full audio settings and sound playing from the speakers. Profiles let me switch between analog and digital in/out. Once I install from the same CD onto the laptop HD and it boots for the first time, when I open the audio settings there is no 'Hardware' tab and no way to change profiles. The worst part is that the audio device is set to SPDIF, so nothing comes out of the speakers. Very odd how booting off the CD I can get analog audio, while installing to the HD and booting seems to limit the profile to something useless. The laptop is a 5-year-old Dell D820 with Nvidia 128 MB video on a 1920x1200 screen and a T7200 CPU. I suspect that if I could get the damn Hardware tab back in the audio settings, I could just select the proper analog profile - just as is the case when running from the boot CD. I've searched the web and found no similar problems... any help appreciated!
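
    As a workaround while the Hardware tab is missing, the analog profile can usually be selected from the command line with PulseAudio's pactl - a sketch; the card index and the exact profile name have to come from the first command (output:analog-stereo is typical but not guaranteed):

        # List the sound cards PulseAudio sees, together with their available profiles.
        pactl list cards

        # Switch card 0 to analog stereo output (substitute the index/profile reported above).
        pactl set-card-profile 0 output:analog-stereo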

    Read the article

  • How do I edit my resolv.conf file?

    - by Ahatius
    I have the problem that my Ubuntu machine uses the wrong DNS server. For some reason it queries localhost for DNS records. I have added the DNS server in the network settings GUI, but /etc/resolv.conf still contains 127.0.0.1 as the DNS server. I thought I could just edit the file myself, but it explicitly says I should not edit it by hand. Now, since the network settings GUI didn't generate the file with the right settings, how do I generate a new resolv.conf file myself?
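
    If the file's header says it is auto-generated, it is normally owned by resolvconf, so the nameserver belongs in resolvconf's input files rather than in /etc/resolv.conf itself - a sketch, assuming resolvconf is what is managing the file and using 8.8.8.8 purely as a placeholder address:

        # Add the nameserver to the head file so it survives regeneration...
        echo 'nameserver 8.8.8.8' | sudo tee -a /etc/resolvconf/resolv.conf.d/head

        # ...then rebuild /etc/resolv.conf from the resolvconf inputs.
        sudo resolvconf -u

        # Verify the result.
        cat /etc/resolv.conf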

    Read the article

  • What user-friendly term should I use for a view that lives under a tab in a tab bar app?

    - by Emile Cormier
    My app uses a tab bar controller. In the user documentation, I'm not sure what name to use for a view that lives under a tab. For example, the app has a Settings tab. In the user documentation, I have a sentence that goes something like this: "This threshold can be adjusted in the Settings tab." "Settings tab" is not terribly user-friendly. What would be a better term than "tab"? I've looked through Apple's Human Interface Guidelines, but I can't find an official user-friendly term for "view that lives under a tab".

    Read the article

< Previous Page | 99 100 101 102 103 104 105 106 107 108 109 110  | Next Page >