Search Results

Search found 7183 results on 288 pages for 'export to excel'.


  • Exporting PowerPoint Slides with Specific Heights and Widths

    - by Damon Armstrong
    I found myself needing to export PowerPoint slides from a presentation and was fairly excited to find that you can save them off in standard image formats. The problem is that Microsoft conveniently exports all images with a resolution of 960 x 720 pixels, which is not the resolution I wanted. You can, however, specify the resolution if you are willing to put a macro into your project:

      Sub ExportSlides()
        For i = 1 To ActiveWindow.Selection.SlideRange.Count
          Dim fileName As String
          If (i < 10) Then
            ' zero-pad single-digit slide numbers so the files sort correctly
            ' (the padding was applied to the wrong branch in the original)
            fileName = "C:\PowerPoint Export\Slide0" & i & ".png"
          Else
            fileName = "C:\PowerPoint Export\Slide" & i & ".png"
          End If
          ActiveWindow.Selection.SlideRange(i).Export fileName, "PNG", 1280, 720
        Next
      End Sub

    When you call the Export method you can specify the file type as well as the dimensions to use when creating the image. If the macro approach is not your thing, you can also modify the default settings through the registry: http://support.microsoft.com/kb/827745

    Read the article

  • Models with more than one mesh in JMonkeyEngine

    - by Andrea Tucci
    I’m a new jMonkeyEngine developer and I’m beginning to import models. I tried importing simple models and no problems appeared, but when I export an obj model with more than one mesh in the OgreXML format, Blender saves multiple meshes, each with its own material (e.g. one mesh for the face, another for the body, etc.). Can I export all the meshes as one? I’ve tried joining all the meshes into a major one in Blender (face joins body), but when I export the model and then create the Spatial in jME (loading the path of the “merged” mesh), the meshes that were joined to the major one don’t keep their materials! Here is a clearer example: I have an .obj model with 3 meshes and I export it. I get mesh1.mesh.xml, mesh2.mesh.xml, mesh3.mesh.xml and their materials mesh1.material, mesh2.material, mesh3.material. So I import the folder into Assets/Models/Test and now I have to create something like: Spatial head = assetManager.loadModel( [path] ); Spatial face = assetManager.loadModel( [path] ), one for each mesh, and then attach them to a common node. I think there must be a way to merge those meshes while maintaining their materials! What do you think? Thanks
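    For reference, a minimal sketch of the load-and-attach approach described in the question, assuming jME3's standard AssetManager/Node API; the file paths and node name here are hypothetical:

      // load each OgreXML mesh separately (paths are hypothetical)
      Spatial face = assetManager.loadModel("Models/Test/mesh1.mesh.xml");
      Spatial body = assetManager.loadModel("Models/Test/mesh2.mesh.xml");
      Spatial hair = assetManager.loadModel("Models/Test/mesh3.mesh.xml");
      Node character = new Node("character");  // com.jme3.scene.Node
      character.attachChild(face);             // each Spatial keeps the material
      character.attachChild(body);             // loaded from its own .material file
      character.attachChild(hair);
      rootNode.attachChild(character);         // move/rotate the whole Node as one unit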

    Read the article

  • create a .deb Package from scripts or binaries

    - by tdeutsch
    I searched for a simple way to create .deb packages for things that have no source code to compile (configs, shell scripts, proprietary software). This was quite a problem because most packaging tutorials assume you have a source tarball you want to compile. Then I found this short tutorial (German). Afterwards, I created a small script to build a simple repository, like this:

      rm /export/my-repository/repository/*
      cd /home/tdeutsch/deb-pkg
      for i in $(ls | grep my); do dpkg -b ./$i /export/my-repository/repository/$i.deb; done
      cd /export/avanon-repository/repository
      gpg --armor --export "My Package Signing Key" > PublicKey
      apt-ftparchive packages ./ | gzip > Packages.gz
      apt-ftparchive packages ./ > Packages
      apt-ftparchive release ./ > /tmp/Release.tmp; mv /tmp/Release.tmp Release
      gpg --output Release.gpg -ba Release

    I added the key to the apt keyring and included the source like this:

      deb http://my.default.com/my-repository/ ./

    It looks like the repo itself is working well (I ran into some problems; to fix them I needed to add the Packages file twice and use the temp-file workaround for the Release file). I also put some downloaded .deb files into the repo, and they appear to work without problems. But my self-created packages don't... When I do sudo apt-get update, they cause errors like this:

      E: Problem parsing dependency Depends
      E: Error occurred while processing my-printerconf (NewVersion2)
      E: Problem with MergeList /var/lib/apt/lists/my.default.com_my-repository_._Packages
      E: The package lists or status file could not be parsed or opened.

    Does anyone have an idea what I did wrong?
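    A hedged aside: "Problem parsing dependency Depends" usually points at a malformed DEBIAN/control file in the hand-built package, for example an empty Depends: line or a missing final newline. For comparison, a minimal control file for a binary package built with dpkg -b might look like the sketch below; the package name is taken from the error output above, every other field value is a placeholder:

      Package: my-printerconf
      Version: 1.0-1
      Architecture: all
      Maintainer: Your Name <you@example.com>
      Depends: bash (>= 4.0)
      Description: Printer configuration files
       Continuation lines of the description are indented by one space.

    If a package has no dependencies, the whole Depends: line should be omitted rather than left empty.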

    Read the article

  • Automated backups for Windows Azure SQL Database

    - by Greg Low
    One of the questions that I've often been asked is how you can back up databases in Windows Azure SQL Database. What we have had access to is the ability to export a database to a BACPAC. A BACPAC is basically just a zip file that contains a bunch of metadata along with a set of bcp files, one for each of the tables in the database. Each table in the database is exported one after the other, so this does not produce a transactionally-consistent backup at a specific point in time. To get a transactionally-consistent copy, you need a database that isn't in use.

    The easiest way to get a database that isn't in use is to use CREATE DATABASE AS COPY OF. This creates a new database as a transactionally-consistent copy of the database that you are copying. You can then use the export options to get a consistent BACPAC created.

    Previously, I had to automate this process myself. Given there was also no SQL Agent in Azure, I used a job in my on-premises SQL Server to do this, using a linked server configuration.

    Now there's a much simpler way. Windows Azure SQL Database now supports an automated export function. On the Configuration tab for the database, you need to enable the Automated Export function. You can configure how often the operation is performed for you, and which storage account will be used for the backups.

    It's important to consider the cost impact of this as well. You are charged for however many databases are on your server on a given day. So if you enable a daily backup, you will double your database costs. Do not schedule the backups just before midnight UTC, as that could cause you to have three databases each day instead of one.

    This is a much-needed addition to the capabilities. Scott Guthrie also posted about some other notable changes today, including a preview of a new premium offering for SQL Database. In addition to the Web and Business editions, there will now be a Premium edition that has reserved (rather than shared) resources. You can read about it all in Scott's post here: http://weblogs.asp.net/scottgu/archive/2013/07/23/windows-azure-july-updates-sql-database-traffic-manager-autoscale-virtual-machines.aspx
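    A quick sketch of the copy step mentioned above (database names here are placeholders); CREATE DATABASE ... AS COPY OF is the documented T-SQL for creating the transactionally-consistent copy:

      -- run against the master database of the Azure SQL Database server
      CREATE DATABASE SalesDb_CopyForExport AS COPY OF SalesDb;

      -- the copy is asynchronous; its state can be watched while it completes
      SELECT name, state_desc FROM sys.databases WHERE name = 'SalesDb_CopyForExport';

    Once the copy reaches the ONLINE state, it can be exported to a BACPAC and then dropped.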

    Read the article

  • Is using the student version of 3DS Max and Unity3d legal?

    - by SubZeron
    I am developing an indie game together with my friend using the Unity3D engine. I bought "Silo 3D" for modeling two months ago, and for texturing I use 3D-Coat. We plan to sell our game in the future. For the animations I work with 3DS Max (only the animation part). My question is: can I work with a student license? The license for the full version is too expensive for me. I am still at university and cannot afford the 3DS Max license, which costs 4000 €. As an alternative I have the choice between Blender (I can't work with this software and don't have time to invest in learning a new program) and Truespace (can't export FBX animation, especially with bones), so for me 3DS Max is the best choice to be effective and quick. Is it possible to detect this when I export my FBX characters from 3DS Max to Unity3D? I mean, can they find out that I used the student license of 3DS Max for the animations after the release of the game, maybe with the help of DRM? Can I avoid that problem by exporting the FBX from 3DS Max to Blender and after that exporting the same FBX to Unity3D?

    Read the article

  • CodePlex Daily Summary for Tuesday, October 15, 2013

    Popular Releases

    FFXIV Crafting Simulator: Crafting Simulator 2.4.1: -Fixed the offset for the new patch (Auto Loading function)
    iBoxDB.EX - Fast Transactional NoSQL Database Resources: iBoxDB.net fast transactional nosql database 1.5.2: Easily process objects and documents, zero configuration. fast embeddable transactional nosql document database, includes CURD, QueryLanguage, Master-Master-Slave Replication, MVCC, etc. supports .net2, .net4, windows phone, mono, unity3d, node.js , copy and run. http://download-codeplex.sec.s-msft.com/Download?ProjectName=iboxdb&DownloadId=737783 Benchmark with MongoDB Compatibility more platforms for java version
    neurogoody: slicebox: this is the slice box js
    Event-Based Components AppBuilder: AB3.AppDesigner.55: Iteration 55 (Feature): Moving of TargetEdge (simple wires only) by mouse.
    Sandcastle Help File Builder: SHFB v1.9.8.0 with Visual Studio Package: General Information. IMPORTANT: On some systems, the content of the ZIP file is blocked and the installer may fail to run. Before extracting it, right click on the ZIP file, select Properties, and click on the Unblock button if it is present in the lower right corner of the General tab in the properties dialog. This new release contains bug fixes and feature enhancements. There are some potential breaking changes in this release as some features of the Help File Builder have been moved into...
    SharpConfig: SharpConfig 1.2: Implemented comment parsing. Comments are now part of settings and setting categories. New properties: Setting: Comment PreComments SettingCategory: Comment PreComments
    C++ REST SDK (codename "Casablanca"): C++ REST SDK 1.3.0: This release fixes multiple customer reported issues as well as the following: Full support for Dev12 binaries and project files. Full support for Windows XP. New sample highlighting the Client and Server APIs: BlackJack. Expose underlying native handle to set custom options on http_client. Improvements to Listener Library. Note: Dev10 binaries have been dropped as of this release, however the Dev10 project files are still available in the Source Code
    AD ACL Scanner: 1.3.2: Minor bug fixed: Powershell 4.0 will report: Select-Object: Parameter cannot be processed because the parameter name p is ambiguous.
    Json.NET: Json.NET 5.0 Release 7: New feature - Added support for Immutable Collections. New feature - Added WriteData and ReadData settings to DataExtensionAttribute. New feature - Added reference and type name handling support to extension data. New feature - Added default value and required support to constructor deserialization. Change - Extension data is now written when serializing. Fix - Added missing casts to JToken. Fix - Fixed parsing large floating point numbers. Fix - Fixed not parsing some ISO date ...
    RESX Manager: ResxManager 0.2.1: FIXED: Many critical bugs have been fixed. New Features: Error logging for improved exception handling. New toolbar. Improvements of user interface
    Fast YouTube Downloader: YouTube Downloader 2.2.0: YouTube Downloader 2.2.0
    VidCoder: 1.5.8 Beta: Added hardware acceleration options: Bicubic OpenCL scaling algorithm, QSV decoding/encoding and DXVA decoding. Updated HandBrake core to SVN 5834. Updated VidCoder setup icon. Fixed crash when choosing the mp4v2 container on x86 and opening on x64. Warning: the hardware acceleration features require specific hardware or file types to work correctly: QSV: Need an Intel processor that supports Quick Sync Video encoding, with a monitor hooked up to the Intel HD Graphics output and the lat...
    ASP.net MVC Awesome - jQuery Ajax Helpers: 3.5.2: version 3.5.2 - fix for setting single value to multivalue controls - datepicker min max date offset fix - html encoding for keys fix - enable Column.ClientFormatFunc to be a function call that will return a function. version 3.5.1 - fixed html attributes rendering - fixed loading animation rendering - css improvements. version 3.5 - autosize for all popups (can be turned off by calling in js awe.autoSize = false) - added Parent, Paremeter extensions ...
    Wsus Package Publisher: Release v1.3.1310.12: Allow the Update Creation Wizard to be set in full screen mode. Fix a bug which prevent WPP to Reset Remote Sus Client ID. Change the behavior of links in the Update Detail Viewer. Left-Click to open, Right-Click to copy to the Clipboard.
    TerrariViewer: TerrariViewer v7 [Terraria Inventory Editor]: This is a complete overhaul but has the same core style. I hope you enjoy it. This version is compatible with 1.2.0.3. Please send issues to my Twitter or https://github.com/TJChap2840
    WDTVHubGen - Adds Metadata, thumbnails and subtitles to WDTV Live Hubs: WDTVHubGen.v2.1.6.maint: I think this covers all of the issues. new additions: fixed the thumbnail problem for backgrounds. general clean up and error checking. need to get this put through the wringer and all feedback is welcome.
    BIDS Helper: BIDS Helper 1.6.4: This BIDS Helper release brings the following new features and fixes: New Features: A new Bus Matrix style report option when you run the Printer Friendly Dimension Usage report for an SSAS cube. The Biml engine is now fully in sync with the supported subset of Varigence Mist 3.4. This includes a large number of language enhancements, bugfixes, and project deployment support. Fixed Issues: Fixed Biml execution for project connections fixing a bug with Tabular Translations Editor not a...
    MoreTerra (Terraria World Viewer): MoreTerra 1.11.3: New Features: New Markers added for Plantera's Bulb, Heart Fruits and Gold Cache. Markers now correctly display for the gems found in rock debris on the floor. Compatibility: Fixed header changes found in Terraria 1.0.3.1
    Media Companion: Media Companion MC3.581b: Fix in place for TVDB xml issue. New: * Movie - General Preferences, allow saving of ignored 'The' or 'A' to end of movie title, stored in sorttitle field. * Movie - New Way for Cropping Posters. Fixed: * Movie - Rename of folders/filename. caught error message. * Movie - Fixed Bug in Save Cropped image, only saving in Pre-Frodo format if Both model selected. * Movie - Fixed Cropped image didn't take zoomed ratio into effect. * Movie - Separated Folder Renaming and File Renaming fuctions durin...
    SmartStore.NET - Free ASP.NET MVC Ecommerce Shopping Cart Solution: SmartStore.NET 1.2.0: Highlights: Multi-store support. "Trusted Shops" plugins. Highly improved SmartStore.biz Importer plugin. Add custom HTML content to pages. Performance optimization. New Features: Multi-store support: now multiple stores can be managed within a single application instance (e.g. for building different catalogs, brands, landing pages etc.). Added 3 new Trusted Shops plugins: Seal, Buyer Protection, Store Reviews. Added Display as HTML Widget to CMS Topics (store owner now can add arbitrary HT...

    New Projects

    Artezio SharePoint 2013 Workflow Activities: SharePoint Workflow 2013 doesn't provide activities to work with permissions; we've fixed it using HttpSend activity that makes REST API calls.
    Dependency.Injection: An attempt to write a really simple dependency injection framework. Does property-based and recursive dependency injection. Handles singletons. Yay!
    DHGMS SUO Killer: SUO Killer is a Visual Studio extension to deal with the removal of SUO file to mitigate SUO related issues in Visual Studio. This project is written in C#.
    dynamicsheet: dynamicsheet
    Excel Comparator: Excel Comparator is an add-in for Microsoft Excel that allows the user to compare a range between two sheets.
    FetchAIP: FetchAIP is a utility to download the various sections of the Aeronautical Information Publication (AIP) for New Zealand.
    Fluent Method and Type Builder: Still working on the summary.
    getboost: NuGet package for Boost framework.
    Goldstone Forum: WebForms Forum - TelerikAcademy Team Project
    GroupMe Software Development Kit: .NET Software Development Kit for http://groupme.com/ chat service.
    GSLMS: ----
    Import Excel Files Into SQL Server: Load Excel files into SQL Database without schema changes.
    Inaction: ??????????
    jBegin: Learning ASP.net MVC from beginning, then here will be the source code for jbegin.com
    KDG's Statistical Quality Control Solver: This tool will include methods that can solve sample standard deviation, sample variance, median, mode, moving average, percentiles, margin of error, etc.
    kpi: Key Performance Indicator (KPI)????; visual studio 2010 with .NET 4.0 runtime
    LECO Remote Control Client Application: Sample code and binaries are provided to demonstrate the remote control capability of a LECO Cornerstone instrument.
    LinkPad: My first Windows Store app intended for student to sketch up thoughts and concepts in quick diagrams.
    Modler.NET - Automating Graphical Data Model Co-Evolution: Modler.NET was the tool created for a Master's thesis project, which automates the co-evolution of graphical data models and the database that they represent.
    MyFileManager1: Summary
    Never Lotto: Korean 465 Lotto Analyzer and Simulator. The real purpose of this project is to show that this kind of lotto things are just shit.
    NHibernate: The purpose of this project is to demo CRUD operations using NHibernate with Mono in Visual Studio 2012 using C# language.
    OAuth2 Authorizer: OAuth2 Authorizer helps you get the access code for a standard OAuth2 REST service that implements 3-legged authentication.
    Regular Expression for Excel: Regular Expression For Excel is an Excel Plugin. It provides a regular expressions EXCEL support. We can use it in the EXCEL function.
    Service Tester: Service Tester is an Azure Cloud based load testing application targeted at Soap Web Services which allows you to invoke your Web Service by random parameters.
    Simple TypeScript and C# Class Generator: Simple GUI application to generate compatible class source code for C# and TypeScript for communications between C# and TypeScript.
    Soccer team management: ---
    Spanner: No more stringly-typed web development! Build statically typed single page web applications in C#, automatically generating all HTML, JavaScript, and Knockout.

    Read the article

  • Data Mining Resources

    - by Dejan Sarka
    There are many different types of analyses, each one with its own pros and cons.

    Relational reports have a predefined structure, and end users cannot change it. They are simple to use for end users. Reports can use real-time data and snapshots of data to show the state of a report at specific points in time. One of the drawbacks is that report authoring is limited to IT pros and advanced users. Any kind of dynamic restructuring is very limited. If real-time data is used for a report, the report has a negative impact on the performance of the source system. Processing of the reports might be slow because the data comes from relational database management systems, which are not optimized for reporting only.

    If you create a semantic model of your data, your end users can create ad-hoc report structures. However, the development is more complex because a developer is needed to create these semantic models.

    For OLAP, you typically use specialized database management systems. You get lightning speed of analyses. End users can use rich and thin clients to interactively change the structure of the report. Typically, they do it graphically. However, the development of an OLAP system is often quite complex. It involves the preparation and maintenance of an enterprise data warehouse and OLAP cubes. In order to exploit the possibility of real-time restructuring of reports, the users must be both active and educated. The data is usually stale, as it is loaded into data warehouses and OLAP cubes with a scheduled process.

    With data mining, a structure is not selected in advance; it searches for the structure. As a result, data mining can give you the most valuable results because you can discover patterns you did not expect. A data mining model structure is limited only by the attributes that you use to train the model. One of the drawbacks is that a lot of knowledge is needed for a successful data mining project. End users have to understand the results. Subject matter experts and IT professionals need to understand the business problem thoroughly. The development might sometimes be even more complex than the development of OLAP cubes.

    Each type of analysis has its own place in an enterprise system. SQL Server has tools for all kinds of analyses. However, data mining is the most advanced way of analyzing the data; this is the “I” in BI. In order to get the most out of it, you need to learn quite a lot. In this blog post, I am gathering together resources for learning, including forthcoming events.

    Books
    Multiple authors: SQL Server MVP Deep Dives – I wrote an introductory data mining chapter there.
    Erik Veerman, Teo Lachev and Dejan Sarka: MCTS Self-Paced Training Kit (Exam 70-448): Microsoft SQL Server 2008 - Business Intelligence Development and Maintenance – you can find a good overview of a complete BI solution, including data mining, in this book.
    Jamie MacLennan, ZhaoHui Tang, and Bogdan Crivat: Data Mining with Microsoft SQL Server 2008 – can’t miss this book if you want to mine your data with SQL Server tools.
    Michael Berry, Gordon Linoff: Mastering Data Mining: The Art and Science of Customer Relationship Management – data mining from both business and technical perspectives.
    Dorian Pyle: Data Preparation for Data Mining – an in-depth book about data preparation.
    Thomas and Ronald Wonnacott: Introductory Statistics – if you thought that you could get away without statistics, then you are not serious about data mining.
    Jiawei Han and Micheline Kamber: Data Mining Concepts and Techniques – in-depth explanation of the most popular data mining algorithms.
    Michael Berry and Gordon Linoff: Data Mining Techniques – another book that explains data mining algorithms, more from a business perspective.
    Paolo Guidici: Applied Data Mining – a very mathematical book, only if you enjoy statistics and mathematics in general.

    Forthcoming presentations
    I am presenting two data mining related sessions during the PASS Summit in Charlotte, NC:
    Wednesday, October 16th, 2013 - Fraud Detection: Notes from the Field – I am showing how to use data mining for a specific business problem. The presentation is based on real-life projects.
    Friday, October 18th: Excel 2013 Advanced Analytics – I am focusing on Excel Data Mining Add-ins, and how to use them together with Power Pivot and other add-ins. This is the most you can get out of Excel.
    Sinergija 2013, Belgrade, Serbia – Tuesday, October 22nd: Excel 2013 Analytics to the Max – another presentation focusing on the most advanced analytics you can get in Excel.
    SQL Rally Amsterdam, Netherlands – Thursday, November 7th: Advanced Analytics in Excel 2013 – and again I am presenting about data mining in Excel.
    Why three different titles for the same presentation? I don’t know; I guess I forgot the name I proposed every time, right after I sent the proposal.

    Courses
    Data Mining with SQL Server 2012 – I wrote a 3-day course for SolidQ. If you are interested in this course, which I could also deliver in a shorter seminar format, you can contact your closest SolidQ subsidiary, or, of course, me directly at [email protected] or [email protected]. This course could also complement the existing courseware portfolio of training providers, who are welcome to contact me as well.

    OK, now you know: no more excuses. Start learning data mining and get the most out of your data.

    Read the article

  • Using openrowset to read an Excel file into a temp table; how do I reference that table?

    - by mattstuehler
    I'm trying to write a stored procedure that will read an Excel file into a temp table, then massage some of the data in that table, then insert selected rows from that table into a permanent table. So, it starts like this:

      SET @SQL = "select * into #mytemptable FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0', 'Excel 8.0;Database="+@file+";HDR=YES', 'SELECT * FROM [Sheet1$]')"
      EXEC (@SQL)

    That much seems to work. However, if I then try something like this:

      Select * from #mytemptable

    I get an error: Invalid object name '#mytemptable'. Why isn't #mytemptable recognized? Is there a way to have #mytemptable accessible to the rest of the stored procedure? Many thanks in advance!
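    A likely explanation, offered as a hedged note: a local temp table created inside EXEC(@SQL) belongs to that inner batch's scope and is dropped the moment the dynamic batch returns. One common workaround sketch is a global temp table (## prefix), which outlives the EXEC scope:

      SET @SQL = "select * into ##mytemptable FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0', 'Excel 8.0;Database=" + @file + ";HDR=YES', 'SELECT * FROM [Sheet1$]')"
      EXEC (@SQL)
      -- visible here: ## tables are not scoped to the dynamic batch
      SELECT * FROM ##mytemptable
      DROP TABLE ##mytemptable  -- clean up explicitly; the name is shared across sessions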

    Read the article

  • Why are cookies unrecognized when a link is clicked from an external source (e.g. Excel, Word, etc..

    - by Nick
    I noticed that when a link is clicked externally to the web browser, such as from Excel or Word, my session cookie is initially unrecognized, even if the link opens up in a new tab of the same browser window. The browser ends up recognizing its cookie eventually, but I am puzzled as to why that initial link from Excel or Word doesn't work. To make it even more challenging, clicking a link works fine from Outlook. Does anybody know why this might be happening? I'm using the Zend Framework with PHP 5.3.

    Read the article

  • Animations in FBX exported from Maya are anchored in the wrong place

    - by Simon P Stevens
    We are trying to export a model and animation from Maya into Unity3D. In Maya, the model is anchored (pivot point) at the feet (and the body moves up and down). However, after we have performed the FBX export and imported the file into Unity, the model now appears to be anchored by the waist/head, and the feet move. These example videos probably help explain the problem more clearly: Example video - Maya - Correct; Example video - Unity - Wrong. We have also noticed that if we take the FBX file and import it back into Maya, we have exactly the same problem. It seems that the constraints no longer work after the FBX is reimported back into Maya, which kills the connection between the joints and the control objects. When we exported the FBX we tried checking the 'bake animations' check box. The fact that the same problem exists when importing the FBX back into both Maya and Unity suggests that the source of the problem is most likely the Maya FBX export. Has anyone encountered this problem before, and any ideas how to fix it?

    Read the article

  • Exporting SWF With Transparent Background For Scaleform/UDK

    - by Alex Shepard
    After looking all over the UDN and forums I have yet to find a solution for this: I am currently using Flash CS3 and ActionScript 2.0 to build my Scaleform menus, and I can use them in the UDK. For various reasons I can't use the handy plugin Autodesk supplies to enable this export, so I publish my Flash documents to SWF the old-fashioned way and manually use the gfxexport.exe tool to get my .gfx file. I can then import into the UDK the normal way. My problem is that the Flash movies that I import will not alpha-blend, even if the material is set to blend in the alpha channel of the target render texture. My project images are set up to export properly. My classpath for ActionScript 2.0 is set to the correct location. My HTML publish settings have window mode set to Transparent Windowless. Is it possible to export without the Scaleform Flash extension and still get the desired effects, and if so, how might I do so? Am I merely missing something in my project setup?

    Read the article

  • "Please ensure you have JAVA_HOME points to JDK rather than JRE" message

    - by Alex Malex
    I have Java installed:

      aaa@ubuntu:~$ whereis java
      java: /usr/bin/java /usr/bin/X11/java /usr/local/java /usr/share/java
      aaa@ubuntu:~$ whereis javac
      javac: /usr/bin/javac /usr/bin/X11/javac

    and in /etc/profile:

      JAVA_HOME=/usr/local/java/jdk1.7.0_17
      PATH=$PATH:$HOME/bin:$JAVA_HOME/bin
      JRE_HOME=/usr/local/java/jre1.7.0_17
      PATH=$PATH:$HOME/bin:$JRE_HOME/bin
      export JAVA_HOME
      export JRE_HOME
      export PATH

    However, when I run Android Studio, it says: "tools.jar is not in Android Studio classpath. Please ensure you have JAVA_HOME points to JDK rather than JRE." How do I fix it?

    Update:

      sudo update-alternatives --get-selections | grep ^java
      java    manual   /usr/local/java/jre1.7.0_17/bin/java
      javac   manual   /usr/local/java/jdk1.7.0_17/bin/javac
      javaws  manual   /usr/local/java/jre1.7.0_17/bin/javaws
      java -version
      java version "1.7.0_17"
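    A hedged observation: in the output above, the java alternative points at the JRE, and the JDK's bin is appended to the end of PATH, so the java that actually runs is a JRE copy even though JAVA_HOME names a JDK. A minimal sketch of a fix, assuming the paths shown in the question:

      # in /etc/profile (or ~/.profile): point JAVA_HOME at the JDK only,
      # and put its bin directory at the FRONT of PATH so the JDK's java
      # and javac shadow any JRE copies
      export JAVA_HOME=/usr/local/java/jdk1.7.0_17
      export PATH=$JAVA_HOME/bin:$PATH
      # a JDK ships tools.jar in $JAVA_HOME/lib; a JRE does not, which is
      # what the Android Studio message is complaining about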

    Read the article

  • Why ~/.bash_profile is not getting sourced when opening a terminal in Ubuntu 11.04?

    - by Viriato
    Problem: I have an Ubuntu 11.04 virtual machine and I wanted to set up my Java development environment. I did as follows:

      sudo apt-get install openjdk-6-jdk

    Added the following entries to ~/.bash_profile:

      export JAVA_HOME=/usr/lib/jvm/java-6-openjdk
      export PATH=$PATH:$JAVA_HOME/bin

    Saved the changes, exited, opened up a terminal again, and typed the following:

      echo $JAVA_HOME    (blank)
      echo $PATH         (displayed, but without the JAVA_HOME value)

    Nothing happened, as if the export of JAVA_HOME and its addition to the PATH were never done.

    Solution: I had to go to ~/.bashrc and add the following entry towards the end of the file:

      # Source bash_profile to set JAVA_HOME and add it to the PATH,
      # because for some reason it is not being picked up
      . ~/.bash_profile

    Questions: Why did I have to do that? I thought bash_profile (or bash_login, or profile in the absence of those two) gets executed before bashrc. Was my terminal in this case a non-login shell? If so, why, when doing su in the terminal and entering the password, did it not execute profile, where I had also set the exports mentioned above?

    Read the article

  • Problem in filtering records using Dataview (C#3.0)

    - by Newbie
    I have a data table that is populated from Excel sheets, and there are many Excel sheets, so I have written a utility method for accomplishing this. In some of the Excel sheets there are date columns, and in some there are not (only text/string). My function populates the values properly into the DataTable from the Excel sheet. But there are many blank rows in the Excel sheets: some are filled with NULL, some with " ". So I need to filter those records (which are NULL or " ") before further processing. What I am after is to use a DataView and apply the filter there:

      DataView dv = dataTable.DefaultView;
      dv.RowFilter = ColumnName + " <> ''";

    By using metadata (GetOleDbSchemaTable(OleDbSchemaGuid.Columns, restrection)) I was able to get the column names from the Excel sheet, so getting the column names is not an issue. But the problem is, as I said, some Excel sheets have date fields and some do not, so the filter condition of the DataView needs to match the column type. If I apply the above logic and it encounters a date field, it throws the error: Cannot perform '<' operation on System.DateTime and System.String. Could you please help me out? I need to filter columns (not known at compile time, nor their data types) which can have NULL and " ". I am using C# 3.0. Thanks
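    A hedged sketch of one way around the type mismatch: branch on the column's runtime type so the '' comparison is only applied to string columns, and fall back to a plain null check for everything else (the column name comes from the question; the rest is illustrative):

      // IS NULL and TRIM() are part of the DataColumn expression syntax,
      // so the filter can be built per column type at run time
      Type colType = dataTable.Columns[ColumnName].DataType;
      string col = "[" + ColumnName + "]";
      dv.RowFilter = (colType == typeof(string))
          ? col + " IS NOT NULL AND TRIM(" + col + ") <> ''"
          : col + " IS NOT NULL";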

    Read the article

  • Exited event of Process is not raised?

    - by Kanags.Net
    In my application, I am opening an Excel sheet to show one of my Excel documents to the user. But before showing the Excel file I save it to a folder on my local machine, which in fact is used for showing it. When the user closes the application I wish to close the opened Excel files and delete all the Excel files present in my local folder. For this, in the logout event I have written code to close all the opened files, like shown below:

      Process[] processes = Process.GetProcessesByName(fileType);
      foreach (Process p in processes)
      {
          IntPtr pFoundWindow = p.MainWindowHandle;
          if (p.MainWindowTitle.Contains(documentName))
          {
              p.CloseMainWindow();
              p.Exited += new EventHandler(p_Exited);
          }
      }

    And in the process Exited event I wish to delete the Excel file whose process has exited, like shown below:

      void p_Exited(object sender, EventArgs e)
      {
          string file = strOriginalPath;
          if (File.Exists(file))
          {
              // Pdf issue fix
              FileStream fs = new FileStream(file, FileMode.Open, FileAccess.Read);
              fs.Flush();
              fs.Close();
              fs.Dispose();
              File.Delete(file);
          }
      }

    But the problem is that this Exited event is not called at all. On the other hand, if I delete the file after closing the main window of the process, I get the exception "File already used by another process". Could anyone help me achieve my objective, or give me a reason why the process Exited event is not being called?
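    One likely culprit, offered as a hedged sketch: Process.Exited is only raised when EnableRaisingEvents is set to true, and the handler is best attached before the process is asked to close:

      foreach (Process p in processes)
      {
          if (p.MainWindowTitle.Contains(documentName))
          {
              p.EnableRaisingEvents = true;            // without this, Exited never fires
              p.Exited += new EventHandler(p_Exited);  // subscribe first...
              p.CloseMainWindow();                     // ...then ask the window to close
          }
      }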

    Read the article

  • How do I know if I am using Scrum methodologies?

    - by Jake
    When I first started at my current job, my purpose was to rewrite a massive Excel-VBA workbook application in C# WinForms, because it was thought that the new C# app would fix all existing problems and have all the new features for a perfect world. If it were a direct port, in theory it would be easy, as I would just need to go through all the formulas, conditional formatting, validations, VBA, etc. to understand it. However, that was not the case. Many of the new features are tightly dependent on business logic I am unfamiliar with. As a solo programmer, the first year was spent solely on deciphering the Excel workbook and writing the C# app. In theory, I had the business people to "help" me specify requirements, how the GUI looks and works, testing of the app, etc.; but in practice it is like a constant tsunami of feature creep.

    At the beginning of the second year I managed to convince management that this was going nowhere. I made them start from scratch with the Excel-VBA. I have an "issue log" saved on the network; each time they find something they don't like about the Excel-VBA app, they write it in there. I check the log daily and consolidate issues (in my mind) mainly into two groups: (1) requires massive change, and (2) can be fixed in the current version. For massive-change issues, I make a copy of the latest Excel-VBA and give it a new version number, then work on it whenever I can. For current-version fixes, I make the changes in a few days to a week, and then immediately release. I also ensure I apply the same change to any in-progress massive-change future versions.

    This has gone on for about 4 months and I feel it works great. I have made many releases and solved many real issues, and I have understood the business logic more and more. However, my boss (non-IT trained) thinks what I am doing is just ad-hoc changes and that I am not looking at the "bigger picture". I am struggling to convince my boss that this works. So I hope to formalise my approach and maybe borrow a buzzword to confuse him. Incidentally, I read about Agile and Scrum, about backlogs and sprints. But it's all still very vague to me.

    QUESTION (finally): I want to tell him that this is Scrum! But I want to hold my breath first and ask whether my current approach is considered Scrum or Scrum-like? How can I make it more Scrum-like? Note that it is only me; there's no project leader or team.

    Read the article

  • How To Delete Built-in Windows 7 Power Plans (and Why You Probably Shouldn’t)

    - by The Geek
    Do you actually use the Windows 7 power management features? If so, have you ever wanted to just delete one of the built-in power plans? Here's how you can do so, and why you probably should leave it alone. Just in case you're new to the party, we're talking about the power plans that you see when you click on the battery/plug icon in the system tray. The problem is that one of the built-in plans always shows up there, even if you only use custom plans. When you go to "More power options" on the menu there, you'll be taken to a list of them, but you'll be unable to get rid of any of the built-in ones, even if you have your own. You can actually delete the power plans, but it will probably cause problems, so we highly recommend against it. If you still want to proceed, keep reading.

    Delete Built-in Power Plans in Windows 7

    Open up an Administrator mode command prompt by right-clicking on the command prompt and choosing "Run as Administrator", then type in the following command, which will show you a whole list of the plans:

      powercfg list

    Do you see that really long GUID code in the middle of each listing? That's what we're going to need for the next step. To make it easier, we'll provide the codes here, just in case you don't know how to copy to the clipboard from the command prompt:

      Power Scheme GUID: 381b4222-f694-41f0-9685-ff5bb260df2e  (Balanced)
      Power Scheme GUID: 8c5e7fda-e8bf-4a96-9a85-a6e23a8c635c  (High performance)
      Power Scheme GUID: a1841308-3541-4fab-bc81-f71556f20b4a  (Power saver)

    Before you do any deleting, what you're going to want to do is export the plan to a file using the -export parameter. For some unknown reason, I used the .xml extension when I did this, though the file isn't in XML format. Moving on... here's the syntax of the command:

      powercfg -export balanced.xml 381b4222-f694-41f0-9685-ff5bb260df2e

    This will export the Balanced plan to the file balanced.xml. And now we can delete the plan by using the -delete parameter and the same GUID:

      powercfg -delete 381b4222-f694-41f0-9685-ff5bb260df2e

    If you want to import the plan again, you can use the -import parameter, though it has one weirdness: you have to specify the full path to the file, like this:

      powercfg -import c:\balanced.xml

    Using what you've learned, you can export each of the plans to a file, and then delete the ones you want to delete.

    Why Shouldn't You Do This? Very simple. Stuff will break. On my test machine, for example, I removed all of the built-in plans, and then imported them all back in, but I'm still getting an error anytime I try to access the panel to choose what the power buttons do. There are a lot more error messages, but I'm not going to waste your time with all of them. So if you want to delete the plans, do so at your own peril. At least you've been warned!

    Read the article

  • Lenovo X220 right click does not work with ubuntu 12.04

    - by fulop
    I am unable to right click with my new X220 Lenovo sub-notebook. I have read several workaround but even not know which one would help me. Can someone help me to find the solution or workaround? dpkg-buildpackage: export CFLAGS from dpkg-buildflags (origin: vendor): -g -O2 -fstack-protector --param=ssp-buffer-size=4 -Wformat -Wformat-security dpkg-buildpackage: export CPPFLAGS from dpkg-buildflags (origin: vendor): -D_FORTIFY_SOURCE=2 dpkg-buildpackage: export CXXFLAGS from dpkg-buildflags (origin: vendor): -g -O2 -fstack-protector --param=ssp-buffer-size=4 -Wformat -Wformat-security dpkg-buildpackage: export FFLAGS from dpkg-buildflags (origin: vendor): -g -O2 dpkg-buildpackage: export LDFLAGS from dpkg-buildflags (origin: vendor): -Wl,-Bsymbolic-functions -Wl,-z,relro dpkg-buildpackage: source package xserver-xorg-input-synaptics dpkg-buildpackage: source version 1.6.2-1ubuntu1~precise2 dpkg-buildpackage: source changed by Timo Aaltonen <[email protected]> dpkg-buildpackage: host architecture amd64 dpkg-source --before-build xserver-xorg-input-synaptics-1.6.2 fakeroot debian/rules clean dh clean --with quilt,autoreconf,xsf --builddirectory=build/ dh_testdir -O--builddirectory=build/ dh_auto_clean -O--builddirectory=build/ dh_quilt_unpatch -O--builddirectory=build/ Removing patch 131_reset-num_active_touches-on-deviceoff.patch Restoring src/synaptics.c Removing patch 130_dont_enable_rightbutton_area.patch Restoring conf/50-synaptics.conf Removing patch 129_disable_three_touch_tap.patch Restoring src/synaptics.c Removing patch 128_disable_three_click_action.patch Restoring src/synaptics.c Removing patch 126_ubuntu_xi22.patch Restoring configure.ac Removing patch 125_option_rec_revert.patch Restoring test/fake-symbols.h Restoring test/fake-symbols.c Removing patch 124_syndaemon_events.patch Restoring tools/syndaemon.c Removing patch 118_quell_error_msg.patch Restoring tools/synclient.c Restoring tools/syndaemon.c Removing patch 115_evdev_only.patch Restoring conf/50-synaptics.conf Removing patch 106_always_enable_vert_edge_scroll.patch Restoring src/synaptics.c Removing patch 104_always_enable_tapping.patch Restoring src/synaptics.c Removing patch 103_enable_cornertapping.patch Restoring src/synaptics.c Removing patch 101_resolution_detect_option.patch Restoring include/synaptics-properties.h Restoring man/synaptics.man Restoring src/synapticsstr.h Restoring src/properties.c Restoring src/synaptics.c Restoring tools/synclient.c Removing patch 02-do-not-use-synaptics-for-keyboards.patch Restoring conf/11-x11-synaptics.fdi No patches applied dh_autoreconf_clean -O--builddirectory=build/ dh_clean -O--builddirectory=build/ dpkg-source -b xserver-xorg-input-synaptics-1.6.2 dpkg-source: warning: no source format specified in debian/source/format, see dpkg-source(1) dpkg-source: info: using source format `1.0' dpkg-source: info: building xserver-xorg-input-synaptics using existing xserver-xorg-input-synaptics_1.6.2.orig.tar.gz dpkg-source: info: building xserver-xorg-input-synaptics in xserver-xorg-input-synaptics_1.6.2-1ubuntu1~precise2.diff.gz dpkg-source: warning: the diff modifies the following upstream files: autogen.sh docs/README.alps docs/tapndrag.dia docs/trouble-shooting.txt dpkg-source: info: use the '3.0 (quilt)' format to have separate and documented changes to upstream files, see dpkg-source(1) dpkg-source: info: building xserver-xorg-input-synaptics in xserver-xorg-input-synaptics_1.6.2-1ubuntu1~precise2.dsc debian/rules build dh build --with quilt,autoreconf,xsf 
--builddirectory=build/ dh_testdir -O--builddirectory=build/ dh_quilt_patch -O--builddirectory=build/ Applying patch 02-do-not-use-synaptics-for-keyboards.patch patching file conf/11-x11-synaptics.fdi Hunk #1 succeeded at 9 (offset 7 lines). Applying patch 101_resolution_detect_option.patch patching file include/synaptics-properties.h patching file man/synaptics.man patching file src/properties.c Hunk #3 succeeded at 787 (offset 6 lines). patching file src/synaptics.c Hunk #2 succeeded at 1403 (offset 3 lines). Hunk #3 succeeded at 1421 (offset 3 lines). patching file src/synapticsstr.h patching file tools/synclient.c Applying patch 103_enable_cornertapping.patch patching file src/synaptics.c Hunk #1 succeeded at 762 with fuzz 1 (offset 202 lines). Applying patch 104_always_enable_tapping.patch patching file src/synaptics.c Hunk #1 succeeded at 662 with fuzz 2 (offset 6 lines). Applying patch 106_always_enable_vert_edge_scroll.patch patching file src/synaptics.c Hunk #1 succeeded at 673 (offset 174 lines). Applying patch 115_evdev_only.patch patching file conf/50-synaptics.conf Hunk #1 succeeded at 14 with fuzz 2. Applying patch 118_quell_error_msg.patch patching file tools/synclient.c patching file tools/syndaemon.c Applying patch 124_syndaemon_events.patch patching file tools/syndaemon.c Applying patch 125_option_rec_revert.patch patching file test/fake-symbols.c patching file test/fake-symbols.h Applying patch 126_ubuntu_xi22.patch patching file configure.ac Applying patch 128_disable_three_click_action.patch patching file src/synaptics.c Hunk #1 succeeded at 671 (offset 174 lines). Applying patch 129_disable_three_touch_tap.patch patching file src/synaptics.c Hunk #1 succeeded at 665 (offset 32 lines). Applying patch 130_dont_enable_rightbutton_area.patch patching file conf/50-synaptics.conf Applying patch 131_reset-num_active_touches-on-deviceoff.patch patching file src/synaptics.c Applying patch 201-wait.patch patching file src/eventcomm.c Hunk #1 FAILED at 750. Hunk #2 FAILED at 775. Hunk #3 FAILED at 784. 3 out of 3 hunks FAILED -- rejects in file src/eventcomm.c Patch 201-wait.patch does not apply (enforce with -f) dh_quilt_patch: quilt --quiltrc /dev/null push -a || test $? = 2 returned exit code 1 make: *** [build] Error 25 dpkg-buildpackage: error: debian/rules build gave error exit status 2

    Read the article

  • Solaris 11.1: Encrypted Immutable Zones on (ZFS) Shared Storage

    - by darrenm
    Solaris 11 brought both ZFS encryption and the Immutable Zones feature, and I've talked about the combination in the past. Solaris 11.1 adds a fully supported method of storing zones in their own ZFS pool using shared storage, so let's update things a little and put all three parts together.

    When using an iSCSI (or other supported shared storage) target for a zone, we can either let the Zones framework set up the ZFS pool or we can do it manually beforehand and tell the Zones framework to use the one we made earlier. To enable encryption we have to take the second path, so that we can set up the pool with encryption before we start to install the zone on it. We start by configuring the zone and specifying a rootzpool resource:

      # zonecfg -z eizoss
      Use 'create' to begin configuring a new zone.
      zonecfg:eizoss> create
      create: Using system default template 'SYSdefault'
      zonecfg:eizoss> set zonepath=/zones/eizoss
      zonecfg:eizoss> set file-mac-profile=fixed-configuration
      zonecfg:eizoss> add rootzpool
      zonecfg:eizoss:rootzpool> add storage \
        iscsi://zs7120-tvp540-c.uk.oracle.com/luname.naa.600144f09acaacd20000508e64a70001
      zonecfg:eizoss:rootzpool> end
      zonecfg:eizoss> verify
      zonecfg:eizoss> commit
      zonecfg:eizoss>

    Now let's create the pool and specify encryption:

      # suriadm map \
        iscsi://zs7120-tvp540-c.uk.oracle.com/luname.naa.600144f09acaacd20000508e64a70001
      PROPERTY    VALUE
      mapped-dev  /dev/dsk/c10t600144F09ACAACD20000508E64A70001d0
      # echo "zfscrypto" > /zones/p
      # zpool create -O encryption=on -O keysource=passphrase,file:///zones/p eizoss \
        /dev/dsk/c10t600144F09ACAACD20000508E64A70001d0
      # zpool export eizoss

    Note that the keysource above is just for this example; realistically you should probably use an Oracle Key Manager or some other better key storage, but that isn't the purpose of this example. Note, however, that it does need to be one of file://, https://, or pkcs11:, and not prompt for the key location. Also note that we exported the newly created pool. The name we used here doesn't actually matter, because it will get set properly on import anyway.

    So let's go ahead and do our install:

      # zoneadm -z eizoss install -x force-zpool-import
      Configured zone storage resource(s) from:
        iscsi://zs7120-tvp540-c.uk.oracle.com/luname.naa.600144f09acaacd20000508e64a70001
      Imported zone zpool: eizoss_rpool
      Progress being logged to /var/log/zones/zoneadm.20121029T115231Z.eizoss.install
        Image: Preparing at /zones/eizoss/root.
        AI Manifest: /tmp/manifest.xml.ujaq54
        SC Profile: /usr/share/auto_install/sc_profiles/enable_sci.xml
        Zonename: eizoss
        Installation: Starting ...
        Creating IPS image
        Startup linked: 1/1 done
        Installing packages from:
          solaris
            origin: http://pkg.us.oracle.com/solaris/release/
        Please review the licenses for the following packages post-install:
          consolidation/osnet/osnet-incorporation (automatically accepted, not displayed)
        Package licenses may be viewed using the command:
          pkg info --license <pkg_fmri>
        DOWNLOAD    PKGS     FILES        XFER (MB)    SPEED
        Completed   187/187  33575/33575  227.0/227.0  384k/s
        PHASE                            ITEMS
        Installing new actions           47449/47449
        Updating package state database  Done
        Updating image state             Done
        Creating fast lookup database    Done
        Installation: Succeeded
        Note: Man pages can be obtained by installing pkg:/system/manual
        done.
        Done: Installation completed in 929.606 seconds.
        Next Steps: Boot the zone, then log into the zone console (zlogin -C) to complete the configuration process.
      Log saved in non-global zone as /zones/eizoss/root/var/log/zones/zoneadm.20121029T115231Z.eizoss.install

    That was really all we had to do; when the install is done, boot it up as normal. The zone administrator has no direct access to the ZFS wrapping keys used for the encrypted pool the zone is stored on. Due to how inheritance works in ZFS, he can still create new encrypted datasets that use those wrapping keys (without them ever being inside a process in the zone), or he can create encrypted datasets inside the zone that use keys of his own choosing. The output below shows the two cases: rpool is inheriting the key material from the global zone (note that we can see the value of the keysource property, but we don't use it inside the zone, nor does that path need to be, or is, accessible inside the zone), whereas rpool/export/home/bob has set keysource locally.

      # zfs get encryption,keysource rpool rpool/export/home/bob
      NAME                   PROPERTY    VALUE                       SOURCE
      rpool                  encryption  on                          inherited from $globalzone
      rpool                  keysource   passphrase,file:///zones/p  inherited from $globalzone
      rpool/export/home/bob  encryption  on                          local
      rpool/export/home/bob  keysource   passphrase,prompt           local

    Read the article

  • Juggling with JDKs on Apple OS X

    - by Blueberry Coder
    I recently got a shiny new MacBook Pro to help me support our ADF Mobile customers. It is really a wonderful piece of hardware, although I am still adjusting to Apple's peculiar keyboard layout. Did you know, for example, that the « delete » key actually performs a « backspace »? But I digress... As you may know, ADF Mobile development still requires JDeveloper 11gR2, which in turn runs on Java 6. On the other hand, JDeveloper 12c needs JDK 7. I wanted to install both versions, and wasn't sure how to do it.

    If you remember, I explained in a previous blog entry how to install JDeveloper 11gR2 on Apple's OS X. The trick was to use the /usr/libexec/java_home command in order to invoke the proper JDK. In this case, I could have done the same thing; the two JDKs can coexist without any problems, since they install in completely different locations. But I wanted more than just installing JDeveloper. I wanted to be able to select my JDK when using the command line as well. On Windows, this is easy, since I keep all my JDKs in a central location. I simply have to move to the appropriate folder or type the folder name in the command I want to execute. Problem is, on OS X, the paths to the JDKs are... let's say convoluted. Here is the one for Java 6:

      /System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home

    The Java 7 path is no better, just different:

      /Library/Java/JavaVirtualMachines/jdk1.7.0_45.jdk/Contents/Home

    Intuitive, isn't it? Clearly, I needed something better... On OS X, the default command shell is bash. It is possible to configure the shell environment by creating a file named « .profile » in a user's home folder. Thus, I created such a file and put the following inside:

      export JAVA_7_HOME=$(/usr/libexec/java_home -v1.7)
      export JAVA_6_HOME=$(/usr/libexec/java_home -v1.6)
      export JAVA_HOME=$JAVA_7_HOME
      alias java6='export JAVA_HOME=$JAVA_6_HOME'
      alias java7='export JAVA_HOME=$JAVA_7_HOME'

    The first two lines retrieve the current paths for Java 7 and Java 6 and store them in two environment variables. The third line marks Java 7 as the default. The last two lines create command aliases. Thus, when I type java6, the value for JAVA_HOME is set to JAVA_6_HOME, for example. I now have an environment that works even better than the one I have on Windows, since I can change my active JDK on a whim. Here is a sample, fresh from my terminal window:

      fdesbien-mac:~ fdesbien$ java6
      fdesbien-mac:~ fdesbien$ java -version
      java version "1.6.0_65"
      Java(TM) SE Runtime Environment (build 1.6.0_65-b14-462-11M4609)
      Java HotSpot(TM) 64-Bit Server VM (build 20.65-b04-462, mixed mode)

      fdesbien-mac:~ fdesbien$ java7
      fdesbien-mac:~ fdesbien$ java -version
      java version "1.7.0_45"
      Java(TM) SE Runtime Environment (build 1.7.0_45-b18)
      Java HotSpot(TM) 64-Bit Server VM (build 24.45-b08, mixed mode)

    Et voilà! Maximum flexibility without downsides, just as I like it.

    Read the article

  • Crontab: cut line to many lines?

    - by Heoa
    Hard-to-read line:

      @daily export sunshine="~/logs/Sunshine-`date '+\%F'`" && export sunshineUrl="http://www.sunshine.net/main/search_results.asp?currency_id=1&min_price=&max_price=50000&country_id=241&region_id=&Submit=Search" && mkdir -p $sunshine && cd $sunshine && wget --mirror -l 1 $sunshineUrl

    Which mark do I need to split it across many lines?

      @daily <SOME MARK HERE>
      export sunshine="~/logs/Sunshine-`date '+\%F'`" && <SOME MARK HERE>
      export sunshineUrl="http://www.sunshine.net/main/search_results.asp?currency_id=1&min_price=&max_price=50000&country_id=241&region_id=&Submit=Search" && <SOME MARK HERE>
      mkdir -p $sunshine && <SOME MARK HERE>
      cd $sunshine && wget --mirror -l 1 $sunshineUrl

    No success appending \, //, \n or /n.
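    A hedged note for anyone landing here: classic Vixie cron has no line-continuation character at all, so the usual workaround is to move the pipeline into a shell script and call that script from the crontab. A sketch, with a hypothetical script path:

      #!/bin/sh
      # ~/bin/fetch-sunshine.sh (hypothetical location; make it executable)
      sunshine=~/logs/Sunshine-$(date '+%F')   # no \% escaping needed outside a crontab
      sunshineUrl="http://www.sunshine.net/main/search_results.asp?currency_id=1&min_price=&max_price=50000&country_id=241&region_id=&Submit=Search"
      mkdir -p "$sunshine" && cd "$sunshine" && wget --mirror -l 1 "$sunshineUrl"

    and then a single short crontab line:

      @daily $HOME/bin/fetch-sunshine.sh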

    Read the article

  • Creating a tiled map with blender

    - by JamesB
    I'm looking at creating map tiles based on a 3D model made in Blender. The map is 16 x 16 in Blender. I've got 4 different zoom levels and each tile is 100 x 100 pixels. The entire map at the most zoomed-out level is 4 x 4 tiles, constructing an image of 400 x 400. The most zoomed-in level is 256 x 256 tiles, obviously constructing an image of 25600 x 25600. What I need is a script for Blender that can create the tiles from the model. I've never written Python before, so I've been trying to adapt a couple of the scripts that are already there. So far I've come up with a script, but it doesn't work very well. I'm having real difficulties getting the tiles to line up seamlessly. I'm not too concerned about changing the height of the camera, as I can always create the same zoomed-out tiles as 6400 x 6400 images and split the resulting images into the correct tiles. Here is what I've got so far...

      #!BPY
      """
      Name: 'Export Map Tiles'
      Blender: '242'
      Group: 'Export'
      Tip: 'Export to Map'
      """
      import Blender
      from Blender import Scene,sys
      from Blender.Scene import Render

      def init():
          thumbsize = 200
          CameraHeight = 4.4
          YStart = -8
          YMove = 4
          XStart = -8
          XMove = 4
          ZoomLevel = 1
          Path = "/Images/Map/"
          Blender.drawmap = [thumbsize,CameraHeight,YStart,YMove,XStart,XMove,ZoomLevel,Path]

      def show_prefs():
          buttonthumbsize = Blender.Draw.Create(Blender.drawmap[0])
          buttonCameraHeight = Blender.Draw.Create(Blender.drawmap[1])
          buttonYStart = Blender.Draw.Create(Blender.drawmap[2])
          buttonYMove = Blender.Draw.Create(Blender.drawmap[3])
          buttonXStart = Blender.Draw.Create(Blender.drawmap[4])
          buttonXMove = Blender.Draw.Create(Blender.drawmap[5])
          buttonZoomLevel = Blender.Draw.Create(Blender.drawmap[6])
          buttonPath = Blender.Draw.Create(Blender.drawmap[7])
          block = []
          block.append(("Image Size", buttonthumbsize, 0, 500))
          block.append(("Camera Height", buttonCameraHeight, -0, 10))
          block.append(("Y Start", buttonYStart, -10, 10))
          block.append(("Y Move", buttonYMove, 0, 5))
          block.append(("X Start", buttonXStart, -10, 10))
          block.append(("X Move", buttonXMove, 0, 5))
          block.append(("Zoom Level", buttonZoomLevel, 1, 10))
          block.append(("Export Path", buttonPath, 0, 200, "The Path to save the tiles"))
          retval = Blender.Draw.PupBlock("Draw Map: Preferences", block)
          if retval:
              Blender.drawmap[0] = buttonthumbsize.val
              Blender.drawmap[1] = buttonCameraHeight.val
              Blender.drawmap[2] = buttonYStart.val
              Blender.drawmap[3] = buttonYMove.val
              Blender.drawmap[4] = buttonXStart.val
              Blender.drawmap[5] = buttonXMove.val
              Blender.drawmap[6] = buttonZoomLevel.val
              Blender.drawmap[7] = buttonPath.val
              Export()

      def Export():
          scn = Scene.GetCurrent()
          context = scn.getRenderingContext()

          def cutStr(str): # cut off path, leaving name
              c = str.find("\\")
              while c != -1:
                  c = c + 1
                  str = str[c:]
                  c = str.find("\\")
              str = str[:-6]
              return str

          # variables from gui:
          thumbsize,CameraHeight,YStart,YMove,XStart,XMove,ZoomLevel,Path = Blender.drawmap
          XMove = XMove / ZoomLevel
          YMove = YMove / ZoomLevel
          Camera = Scene.GetCurrent().getCurrentCamera()
          Camera.LocZ = CameraHeight / ZoomLevel
          YStart = YStart + (YMove / 2)
          XStart = XStart + (XMove / 2)
          # Point it straight down
          Camera.RotX = 0
          Camera.RotY = 0
          Camera.RotZ = 0
          TileCount = 4**ZoomLevel
          # Because the first thing we do is move the camera, start it off the map
          Camera.LocY = YStart - YMove
          for i in range(0,TileCount):
              Camera.LocY = Camera.LocY + YMove
              Camera.LocX = XStart - XMove
              for j in range(0,TileCount):
                  Camera.LocX = Camera.LocX + XMove
                  Render.EnableDispWin()
                  context.extensions = True
                  context.renderPath = Path
                  # setting thumbsize
                  context.imageSizeX(thumbsize)
                  context.imageSizeY(thumbsize)
                  # could be put into a gui.
                  context.imageType = Render.PNG
                  context.enableOversampling(0)
                  # render
                  context.render()
                  # save image
                  ZasString = '%s' %(int(ZoomLevel))
                  XasString = '%s' %(int(j+1))
                  YasString = '%s' %(int((3-i)+1))
                  context.saveRenderedImage("Z" + ZasString + "X" + XasString + "Y" + YasString)
          # close the windows
          Render.CloseRenderWindow()

      try:
          type(Blender.drawmap)
      except:
          # print 'initialize extern variables'
          init()
      show_prefs()

    Read the article

  • expr non-numeric argument shell script

    - by Kimi
    The check below is not working: expr reports a non-numeric argument, and the test always falls through to the else branch. Please tell me what my mistake is. Earlier, when I was not using a while loop, the same thing was working fine; now that I have put it in the while loop it is not working.

      echo "`${BOLD}` ***** Checking Memory Utilization User*****`${UNBOLD}`"
      echo "==================================================="
      IFS='|'
      cat configMachineDetails.txt | grep -v "^#" | while read MachineType UserName MachineName
      do
        export MEMORY_USAGE1=`ssh -f -T ${UserName}@${MachineName} prstat -t -s rss 1 2 | tr '%' ' '| awk '$5>5.0'`
        export LEN=`echo "$MEMORY_USAGE1"|wc -l`
        export CNPROC=`echo "$MEMORY_USAGE1"|grep "NPROC"|wc -l`
        export CTotal=`echo "$MEMORY_USAGE1"|grep "Total"|wc -l`
        if [ $LEN = `expr $CNPROC + $CTotal` ]
        then
          echo "`${BOLD}`**************All usages are normal !!!!!! *************`${UNBOLD}`"
        else
          echo "`${BOLD}`**** Memory(%) is more than 5% in MachineType $MachineType UserName $UserName MachineName $MachineName *******`${UNBOLD}`"
          echo "===================================================="
          echo "$MEMORY_USAGE1"
        fi
      done
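    A hedged guess at the cause: expr prints "non-numeric argument" when one of its operands is empty or contains stray whitespace, which easily happens inside the loop when a command substitution comes back blank (for example, if the ssh call fails for one host). A defensive sketch of the comparison:

      # default empty counts to 0, quote the expansions, and compare
      # numerically with -eq instead of the string = operator
      LEN=${LEN:-0}; CNPROC=${CNPROC:-0}; CTotal=${CTotal:-0}
      if [ "$LEN" -eq "`expr "$CNPROC" + "$CTotal"`" ]; then
        echo "All usages are normal"
      else
        echo "Memory(%) is more than 5% on $MachineName"
      fi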

    Read the article

  • Using a PivotTable to Count Items in Access

    - by Sandra
    I have a list of text entries and I want to count how often each entry appears in the list, e.g.:

      Berlin
      Paris
      London
      London
      Paris
      Paris
      Paris

    The result would be:

      Berlin 1
      Paris 4
      London 2

    This result is easy to achieve with a pivot table in MS Excel (see: Count Items in Excel). My data, however, is not in an Excel spreadsheet but in an MS Access database table. In order to avoid constantly switching between Access and Excel, I would like to handle everything in Access (either Access 2007 or 2010). I know there are pivot tables in Access and I know how to display one, but I was unable to find out how to count the number of occurrences. Thank you!
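    As an aside, a totals (GROUP BY) query produces the same counts without a pivot table. A sketch, assuming a hypothetical table Cities with a text column CityName:

      SELECT CityName, COUNT(*) AS Occurrences
      FROM Cities
      GROUP BY CityName
      ORDER BY COUNT(*) DESC;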

    Read the article

  • JUnit: 4.8.1 "Could not find class"

    - by Patrick
    OK, I am, like others, new to JUnit and having a difficult time trying to get it working. I have searched the forum, but the answers provided I am just not getting. If anyone out there could lend me a hand I would greatly appreciate it. Let me provide the basics:

    OS: Mac OS X 10.6

      export JUNIT_HOME="/Developer/junit/junit4.8.1"
      export CVSROOT="/opt/cvsroot"
      export PATH="/usr/local/bin:/usr/local/sbin:/usr/localmysql/bin:/opt/PalmSDK/Current/bin/:/usr/local/mysql/bin:$PATH:$JUNIT_HOME:$CVSROOT"
      export CLASSPATH="$CLASSPATH:$JUNIT_HOME/junit-4.8.1.jar:$JUNIT_HOME"

    I can compile a test class from a java file; however, when I then try to run the test:

      java org.junit.runner.JUnitCore MyTest.class

    I get the following:

      JUnit version 4.8.1
      Could not find class: MyTest.class
      Time: 0.001
      OK (0 tests)

    Now, I have been in the directory with MyTest.class, which is just somewhere in my file system. I tried moving the source folder to the "junit" folder and the "junit/junit4.8.1" folder, with the same result. I cannot even run the tests that came with JUnit. Thanks, patrick
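    A hedged pointer, since this trips up many newcomers: JUnitCore expects a class name, not a file name, so the .class suffix makes Java look for a class literally called "MyTest.class". The current directory also has to be on the classpath. A sketch of the invocation, run from the directory containing MyTest.class and using the JUNIT_HOME from the question:

      java -cp .:$JUNIT_HOME/junit-4.8.1.jar org.junit.runner.JUnitCore MyTest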

    Read the article
