Some folks attributed my trouble running PHP scripts to their being prepared in TextEdit, so I switched to GEdit, which comes with Ubuntu's software suite, and I still had no luck. I was aware that TextEdit has a nasty habit of adding the ".rtf" extension to all files, but that was not the case with GEdit, so I have no idea where I went wrong. This is all material I wanted to master, and I would have been satisfied even short of mastery so long as I could get things to run; I had a pretty good run of "luck" (or "ability to follow *some* instructions") before hitting the PHP section. I do have a couple of weeks over break to review everything and attempt the LAMP portion on my Dell laptop running Windows. They always say the mark of an experiment is the ability to replicate it, and I will be the first to admit there were several times this semester when I think I finally got things to work on a fluke. Looking back at the documentation I created in the form of screenshots, there were a lot of shots from when things were going well and few from when I came up against challenges. I have indeed taken steps away from my earlier, long-held reaction of getting emotional when I ought to double down on being rational. The topic I would most like to explore is: where on earth did I go wrong with this PHP stuff? What is it about this particular script (one I had never seen before) that made it such a problem?
Still, I feel like I went far in actually using Linux, opening directories, apt-getting applications and running them, navigating with some degree of dexterity I have never had. Now on to the final and off to work.
Wednesday, December 10, 2014
Tuesday, December 9, 2014
Unit 13: Hitting a brick wall
During our work in Excel at the end of November, I found I would have to install Yosemite to update my OS in order to run the most current versions of Office and Sigil. But if I upgrade to Yosemite, I will lose a ton of old (not free) software I use very frequently, and I'm going to have to buy a whole bunch of new software to replace those apps (Adobe Creative Suite, which I have delayed updating since the end of my design career but still use on an almost weekly basis, and I won't do the subscription versions). I'm not really budgeted for a major software purchase until January. This is an alarming situation, and I don't know what I can do for a workaround.
Regarding coding, I definitely want to learn more about PHP; using it to query our database and get answers back was a very practical project. I definitely want to tame, if not master, PHP. So far all I've done is PHP in this class and a little NetBeans, which seems to be Java-based. Generally with coding you have to cross your t's and dot your i's: make the slightest error and it won't run. That said, when your code compiles and works, it's rewarding. Then you can copy and paste parts of code you know work and customize them here and there.
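The query-the-database-and-get-answers-back workflow isn't specific to PHP; here is a minimal sketch of the same idea in Python, using an in-memory SQLite database in place of the class MySQL server (the table and column names are invented for illustration):

```python
import sqlite3

# In-memory database standing in for the class MySQL server.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE book (isbn TEXT PRIMARY KEY, title TEXT)")
conn.execute("INSERT INTO book VALUES ('9780132350884', 'Clean Code')")

# Parameterized query: the ? placeholder keeps values out of the SQL
# string itself, which is the classic pitfall in hand-built queries.
row = conn.execute(
    "SELECT title FROM book WHERE isbn = ?", ("9780132350884",)
).fetchone()
print(row[0])  # Clean Code
conn.close()
```

The same shape — connect, send a query, read the rows back — is what the PHP scripts in the unit were doing against MySQL.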
I was really impressed with Excel providing those customized lines of code for each ISBN number, and if my Paste Special function were working the right way, I could have done it automatically.
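What Excel was doing there — stamping each ISBN into a code template — is a one-liner in a scripting language. A sketch in Python (the `addBook` template line is invented, not from the class materials):

```python
# Generate one customized line of code per ISBN, the way the Excel
# concatenation formula did. The addBook(...) template is made up.
isbns = ["9780132350884", "9780201633610", "9780596007126"]
lines = [f'addBook("{isbn}");' for isbn in isbns]
for line in lines:
    print(line)
```

Each ISBN in the list comes out as its own ready-to-paste line, no Paste Special required.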
Since writing and compiling code doesn't happen in a WYSIWYG environment (like HTML in a nice editor, where you can just about see your changes on the fly), you might be mystified for a while before identifying where you made a mistake. A compiler is a crazy, very opaque black box, and you realize how much instantaneous feedback can propel a project. When you can see your changes in something like real time you can really move quickly; without that quick feedback from the system you're kind of in the dark.
Monday, November 17, 2014
Unit 12 Project Management
The reading by Mark Keil was very interesting, to say the least. He describes a software project that took up the better part of a decade and was never completely functional before finally getting the ax. Due to project factors, psychological factors, social factors, and organizational factors, there were many reasons this project was allowed to linger on. I guess a very large part of it was that the team involved and its leader had developed many good products in the past before attempting to roll out the CONFIG project. CONFIG was supposed to assist salespeople in making deals with preferred clients, which can be tricky: sales folks do like to throw around discounts for preferred clients, but they definitely need a range to work in so as not to undersell the company (and their own commissions).
Probably the biggest problem this writer can see with the CONFIG project was that it was not interoperable with the same company's other successful and well-accepted system, the Price Quotation System (PQS). I must point out that since this project was happening in the 1980s, maybe software developers were not as attuned to the notion of developing software in interoperable suites. But now we have a very different business model emerging versus the '80s, when a standalone might have had some appeal; even then, a standalone incapable of interoperating with the same company's own products could not have been a good idea. Nevertheless, the company in the study was blindsided by its own track record of success.
As best as I understand, the current ideal model in software development and deployment is not creating an application that the developer sells like books or magazines to customers, it is creating an online web service where users or subscribers can access a whole suite of tools which are continuously upgraded as a web service. In terms of industrial models this is about the difference between designing and manufacturing a locomotive and developing and managing an entire railroad. So the current model of web services is far more vast and encompassing than the old model of software publishing.
So according to Keil, part of the reason for failure of the CONFIG project was that the team involved and its leader had a solid track record of success with the company. As development of CONFIG got bogged down, the team and the company threw more money at the project, something Keil calls project escalation, but nowadays might be called "doubling down," which may or may not be related to "doubling down on a busted flush."
Also the CONFIG project manager, Tom Jones, was very popular within the company and had a fantastic reputation for success, so the company was able to procure the resources Jones requested. The company also thought that if CONFIG ever went live, there would be a huge payoff, plus it had already sunk plenty of resources into R&D, so maybe a little more effort would push the project to completion and its payday. The notion of pulling the plug on the project when it seemed "so close to completion" seemed like dumping so much investment down the drain at the cusp of success.
It was only after two huge blows came to the company that management reviewed and reconsidered the CONFIG project: the death of the project manager, Tom Jones, and a huge downturn in the software market at the end of the 1980s. Only then, after the better part of a decade and countless (mythical) "man-months" had been expended, did the tap of resources get turned off.
Other good readings on project management were the assigned Frank Cervone articles, as he is adept at project management but also goes out of his way to make his advice relevant to librarians. Cervone has developed a formula for risk assessment, weighted by the criticality of the function that would be lost should a given misfortune strike as well as the actual likelihood of such a disaster striking; this was entirely novel to this reader. Cervone stresses that the best risk-avoidance strategy is a high degree of communication throughout the project team and the organization, something I can attest to based on my own experience in software development projects. He couples this with using a flexible model (i.e., anything other than the traditional dependency-heavy "pipeline" model). Some of Cervone's alternative models include the spiral model and iterative prototyping. His continual use of examples from his many library projects adds more validity to his articles as well.
The instructor had a few words on project management being no walk in the park, and from the projects I have worked on, there almost always seems to be some set of problems that can never be foreseen. However, having a plan, especially one that can be modified within reason, is a critical part of the puzzle. It is clear that the traditional "pipeline" model is not flexible enough for contingencies (and for client/project-owner deliberation and changing specs on the fly), so a number of newer models have come out. All of them seem to be variations on a more flexible pipeline, with some way of managing both the many dependencies of programming and the vagaries of the client. Waterfall seems a good way to manage the dependencies; agile techniques like spiral, XP, and iterative prototyping seem to do well in handling ever-changing client specifications. Let's just say the "clean room" model can never be used in any project where the client can change their mind after the project has commenced. A whole batch of permutations between waterfall and more agile models seems to be how project managers and project management theorists have dealt with both factors, but no one model seems to have become dominant.
Tuesday, November 11, 2014
Unit 11: Trial by ordeal
When I began this course, I was allergic to the command line. Okay, that's what I would tell people so I could avoid it, but even then I wanted enough familiarity with the shell that I could at least navigate to where I needed to go, open directories, and run executables. I didn't even understand what configuring meant before starting this class; actually, I hardly knew what the LAMP server setup was about when I read online materials for the DigIn certificate about four years ago while investigating SIRLS. But it seemed impressive as an accomplishment. What's funny is that when I first moved to Tucson in 2007, my immediate goal was to do a series of Flash projects, thinking there was a future in Flash; but a series of jobs involving large collections of paper and digital items, and a series of questions about production bottlenecks, led me to think long and hard about enrolling in a program on the technological aspects of library studies. I was very fortunate to have a very good school in that field right here in Tucson, a city that frequently underwhelms me in most fields of endeavor not related to the U of A.
As always, the conceptual stuff was easier for me; the practical part is still really difficult and frustrating, but at least the frustration isn't irrational or completely emotional like it was before. I have now done things in the command-line environment; had I had those skills two years ago, I would have gotten an IT trainee position in the same library where I am now just a late-night info desk guy. I think my interest in MySQL and my desire to learn whatever I can about databases have motivated a deeper interest in getting better at Linux and comprehending PHP. I guess I needed a functional model in my mind. Previous experience with the command line left me wanting to avoid it at all costs, but I knew I'd have to use it if I wanted to use the remaining components of the LAMP stack. Another thing that really helped me understand how the LAMP configuration works in the framework of the dynamic web (a.k.a. Web 2.0) was the first eight and a half minutes of Prof. Fulton describing it in a video in my IRLS 504 core class; that gelled it in my mind. During the run of this class I finally got a laptop, so I'm going to have to download Ubuntu a second time for that machine and try to set the LAMP stack up there, to be able to work on it in a more flexible fashion (like anytime after 8 p.m.) than my current setup of a desktop in a room that has been taken over by my toddler. That would have been a much better situation than how this has shaken out so far, but live and learn, I guess. If I can create a database like the photographer one, but for my own purposes, it will all be very worthwhile.
Tuesday, November 4, 2014
Unit 10: Databases P.2 (Electric Boogaloo)
SQL seemed a lot easier to learn than Linux; its syntax is more like the kinds of things human beings say to one another (human-readable), for the most part. Last week I committed an EPIC FAIL in posting some extremely flawed tables, because I had accessed the Mostafa tutorials by googling "Mostafa MySQL" instead of going through UACBT/VTC, and doing it through Google was not the same as doing it through the VPN, so I was unable to download his movies after Section 2. This week I viewed all of last week's Mostafa videos and learned that I had not really normalized my data correctly, so it's going to be the pre-fab images folder all the way for me, I guess. The data set I had in mind was a lot less complicated than what we have here, and I couldn't really figure out a primary key for it. If this were the first time I discovered something I posted was completely messed up, I wouldn't mention it, but now I will have a trail of online posts that make me look like an idiot, only a week after I posted them. Good thing the internet is so malleable that nobody will ever see that ;)
So the hardest concept for this week was table joins; I think you join tables to widen "your net" when looking for query results? Mr. Mostafa just about lost me when he started using single-letter abbreviations as table aliases in his commands.
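Those single-letter abbreviations are just nicknames for the tables, declared with `AS`. Here is a small join sketched in Python against an in-memory SQLite database (standing in for MySQL; the photographer-style tables and data are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE author (author_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE photo  (photo_id  INTEGER PRIMARY KEY, title TEXT,
                     author_id INTEGER REFERENCES author(author_id));
INSERT INTO author VALUES (1, 'Adams'), (2, 'Lange');
INSERT INTO photo  VALUES (10, 'Moonrise', 1), (11, 'Migrant Mother', 2);
""")

# The single-letter aliases (a, p) are shorthand declared with AS;
# "a.name" reads as "the name column of the author table".
rows = conn.execute("""
    SELECT a.name, p.title
    FROM author AS a
    JOIN photo  AS p ON p.author_id = a.author_id
    ORDER BY p.photo_id
""").fetchall()
print(rows)  # [('Adams', 'Moonrise'), ('Lange', 'Migrant Mother')]
conn.close()
```

The join does widen the net in exactly that sense: each result row pulls columns from both tables, matched up by the shared `author_id`.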
In answer to a question in the assignments, I have a hunch it would be easy to transcribe data-editing requests from Webmin into the MySQL command line, because Webmin seems to replicate MySQL commands in both syntax and semantics.
Hm, one real challenge for me was attempting to summon Webmin. This whole business of starting Webmin via the command line and then firing it up in a web browser can make your head spin from time to time, but we did it a couple of times, and the second time I was able to do it successfully just from my notes, without needing to google it. Also, when starting Webmin there was some message in the terminal about "no super cow powers," and I would like to find out more about that (you can read about it here; I guess it's an Ubuntu apt-versus-aptitude thing, not Webmin, that is the source of super cow powers:
http://unix.stackexchange.com/questions/92185/whats-the-story-behind-super-cow-powers
).
Oh man, you can tell I've been working with the command line in SQL too long when I make my end parentheses on another line.
I am realizing it is sometimes easier for me to "get" things in review than when, say, Joshua Mostafa is lecturing about them on the first go-round. When he fires up MySQL in subsequent videos, I can write down the commands he uses more easily than when he first introduced them.
Tuesday, October 28, 2014
Unit 9: Databases Part I
The most difficult part for me this week was learning the system of notation in the ERDs and trying to develop a sensible ERD for my own database. That said, I'm really, really excited to learn about databases and relational databases, which I first heard about in the summer of 2006 but had no real idea about until recently. At that time I was working for a plastic surgeon's office, populating a Cumulus visual database with before-and-after pictures of his variety of procedures (URL available upon request). I think the manager was going to create this database and administer the website with Drupal, another application I am just now getting to know in a functional manner.
A few weeks ago I was emailing a friend who is a retired programmer (retired at 33 because he made a fortune), and he told me, "It's all about the db's, man!" I am realizing that yes, it is all about the db's: pretty much any web service or social media site is going to be entirely driven by users accessing servers which call up content from databases (and I guess the content in the database is managed by a CMS). So this is the stuff we will really want to get to know if we are going to be useful. That's a real motivator.
Oh, another hard part was figuring out where I had installed MySQL, because I now have three servers in Ubuntu-land; I had run the apt-get command from the virtual server, but for some reason the application ended up on Virtual Machine #1.
No wait, the hardest part, and something I'm not sure I have fully comprehended correctly, was Third Normal Form. I get the first two forms, and I comprehend the concatenation thing of folding two rows into a single entity. It seems a lot of people have been trying to explain databases, because you can find a LOT of clips on YouTube attempting to explain parts of them. Oh yeah, I worked as a contract project coordinator at Oracle from November 2006 to February 2007; I would have hoped to pick up database knowledge by osmosis, but I guess that doesn't happen, not even if you've seen Larry's helicopter land on the campus. And no, they didn't play "Ride of the Valkyries" over the PA while that happened. Still, I stand firm: Third Normal Form was the hardest thing for me. Maybe Fred Coulson can set it to music? I just ran a search on YouTube and got over 10 pages of results for "Third Normal Form," and most of them actually do look like they refer to RDBMSs (as opposed to being about dogs on skateboards or cats on pianos), so I know what I'll be doing tonight...
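For what it's worth, the usual way 3NF is explained is as removing transitive dependencies: a non-key column must not depend on another non-key column. A worked sketch in Python with SQLite (the book/publisher tables are invented for illustration, not from the class materials):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# NOT in 3NF: publisher_city depends on publisher (a non-key column),
# not on the key (isbn) -- a transitive dependency. The city would be
# repeated on every book from the same publisher.
conn.execute("""CREATE TABLE book_flat (
    isbn TEXT PRIMARY KEY, title TEXT,
    publisher TEXT, publisher_city TEXT)""")

# The 3NF fix: move the publisher facts into their own table, so every
# non-key column depends on the key, the whole key, and nothing but
# the key.
conn.executescript("""
CREATE TABLE publisher (name TEXT PRIMARY KEY, city TEXT);
CREATE TABLE book (isbn TEXT PRIMARY KEY, title TEXT,
                   publisher TEXT REFERENCES publisher(name));
INSERT INTO publisher VALUES ('Prentice Hall', 'Upper Saddle River');
INSERT INTO book VALUES ('9780132350884', 'Clean Code', 'Prentice Hall');
""")

# A join recovers the flat view whenever you need it.
row = conn.execute("""
    SELECT b.title, p.city FROM book AS b
    JOIN publisher AS p ON b.publisher = p.name
""").fetchone()
print(row)  # ('Clean Code', 'Upper Saddle River')
conn.close()
```

The payoff is that a publisher's city lives in exactly one row, so updating it can never leave half the books pointing at a stale value.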
Tuesday, October 21, 2014
Unit 8: Technology Planning
This week's readings were broad and varied. Some of the subject matter was familiar to me, having taken IRLS 674, Managing in the Digital Environment, so Don Sager's article on environmental scanning looked a lot like the SWOT analysis material covered in 674. Michael Stephens' article on technoplans vs. technolust seemed sensible, and although technolust got equal billing, it only accounted for a small portion of the reading. That was actually good, because technolust, although a real trap for organizations with limited budgets or limited needs, is a minor problem in the big scheme of things.
The Bertot reading on federal funding for library tech upgrades was really informative and moved the agenda into the real nuts and bolts of how LSTA funds are disbursed through state libraries (whose existence was heretofore unknown to me), how local libraries submit their technology plans to the Universal Service Administrative Company, and how libraries are currently dealing with the LSTA program.
In Whittaker and company's "What Went Wrong?..." we get an analysis from a business consultant of technology projects that didn't make the final cut and why they failed. Her analysis rang true with my own experience: she sees project failure happening due to poor planning, weak relevance of the change to the business mission, and lack of management support. I honestly think her assessment of a 31% failure rate might be a little on the charitable side; when big technology changes go wrong they are pretty public, but I'm sure there are many smaller technology changes that fail and get swept under the rug. Otherwise, a lot of what Whittaker et al. say seems right on the money. The causes filed under unrealistic planning seem valid, including underestimating training time for new tech, as does her focus on undelivered products from third-party vendors, a huge gripe in some projects I have worked on.
Gwen Gregory's "From Construction to Technology" article was another good nuts-and-bolts summary of how LSTA affects libraries and how LSTA differs from the previous LSCA. It seemed helpful when I read it but doesn't really stand out a couple of days later.
Eric Chabrow's "State of the Union" focused on organizations that had done technology upgrades and suffered for their efforts. It was an easier read than Whittaker's "What Went Wrong" article and in the same vein, but focused on government agencies in the security sector. If your job is hunting bank robbers, tracking terrorists, or making sure corporate malefactors pay their fair share of taxes, dealing with technology issues or a rough technology transition can only add to an already tough job. Chabrow makes a couple of really good points about business, technology, and government. In the enterprise sector, if a transition seems to be failing, a manager will pull the plug, probably at the first sign of trouble. In government, where results are not as accountable to "audit culture," a flagging project can be kept on life support indefinitely. Chabrow's great advice is, "If you're going to fail, fail fast," i.e., don't prolong the agony; just pull the plug.
Chabrow also takes a good look at the toll projects in peril can take on management. He points out that rough business transitions can lead to a huge out-migration of staff and management, and that a project passed among several managers (who have to be brought up to speed after the project has begun) can be a kiss of death for projects that might have succeeded had the initiating manager stayed on. I can tell you from experience: a new manager who does not know the daily problems of a department will have a hard time developing credibility with the staff. Chabrow also scores in discussing third-party contractors and the frictions that arise from having a project run by teams with differing perspectives and differing needs. As far as articles on planning technological and organizational change go, the Chabrow reading was the best.
In Dugan's "Information Technology Plans," we get down to the nitty-gritty of actually writing a technology plan and the kinds of evaluation a library will need to do (including assessment of the environment, as Sager pointed out) to get that technology plan together. His best piece of advice: "A question that should be continuously answered is: why is information technology necessary to fulfill this need? Each response should be outcomes based."
The
OCLC reading on environmental scanning could have come earlier in the stack,
since many of its main points had already been made elsewhere, and it shows its
age, having been created at the beginning of the Web 2.0 era. Still, it was
right on when discussing the trend toward self-service in the library, the
danger of patrons being satisfied with less when they are unaware of the other
information options a librarian could provide, and most of all the trend toward
a seamless information environment (although in the 11 years since the scan was
made, some libraries have caught on to the usefulness of social media).
The
Gerding and MacKellar piece is probably one of the most practical pieces from
this unit. It made the best argument for a library having a technology plan and
also seemed to guide the reader through all the steps to getting a modern
conduit of funding for a library to acquire the technology to reach its goals.
If I knew someone from a library or cultural memory institution that was up and
coming and looking for a way to get their technology plan initiated, I would
probably recommend that they read this article first. Some of the great advice
offered here: organize collaborative efforts with like-minded organizations,
since funders take grant proposals with partners more seriously, and have a
technology plan in place, because a plan gives potential funders proof that the
organization seeking funding is serious, and funders react positively to
organizations that have already determined how they will use their technology.
The article then describes the kinds of technologies that were trending when it
was written and the varieties of grants the Institute of Museum and Library
Services makes available through state libraries, and it closes with success
stories from libraries and a dense distillation of tips for success.
Friday, October 10, 2014
Unit 7 Web servers 2 and more of the networked world
XML and me, Could it be love?
OK, now I am seeing why there has been so much emphasis on keeping tags well nested, giving tags closing tags, and putting quotes around attributes as I was learning updated HTML. This stuff didn't seem as important back in my BBEdit 4.0 early days of coding by hand, but with the rise of CSS and XML these things are now ironclad laws. In fact, my boredom with HTML over the past 16 years just might give way to a budding new interest. My wife loves to make analogies between anything she's talking about and dating, i.e. how looking for jobs is like dating, how choosing classes for next semester is like dating. I'd like to give my own analogy: me acquiring new software languages and dating. HTML was like a high school affair: learning all the ins and outs, what you should and shouldn't do, but carefree and consequence-free. Linux/SSH has been like a college relationship: dark, cold, unresponsive, and requiring that I already know the rules of the game (all the commands) if I expect to get anywhere. Yeah, that relationship went nowhere. But I've heard so many good things about XML for so long: that XML could handle all my data-sorting needs, that XML has extensible tags that are human readable! It's like that early stage of a crush when you can't stop listening to Hard Day's Night-era love songs over and over again. I think when the reading mentioned that XML has lots of uses for librarians, I woke up and took notice.
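To make those ironclad laws concrete, here is a minimal sketch of what a well-formed XML file for my reference-image idea might look like. The file name, tags and values are all ones I invented for illustration, not anything from the course:

```shell
# Work in a scratch directory so nothing real gets touched.
workdir=$(mktemp -d)
cd "$workdir"

# Every tag is closed, tags are strictly nested, and every
# attribute value is quoted -- the three well-formedness rules.
cat > refimages.xml <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<images>
  <image id="001" format="jpeg">
    <title>Lighthouse at dusk</title>
    <subject>architecture</subject>
  </image>
</images>
EOF

# And, as the W3C note says, XML doesn't DO anything:
# it just sits there as structured, human-readable data.
cat refimages.xml
```

The tags ("image," "title," "subject") mean nothing to any program until something is written to read them; that extensibility is the whole point.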
I'm also realizing that a few humorous things in blogs that I thought were mock-html just may have been actual XML and that what I was calling "pseudo-html" in terms of the markup for writing a Wikipedia entry might very well be XML.
Before I began taking DigIn or SIRLS classes, I had built up a colossal file of reference images and wanted to develop a systematic means of attributing data to these images, so XML might be the way to go with that, i.e. it could be a good school project in a nice sandbox environment and beneficial to my outside interest to create this for that database.
So, I watched the Mark Long videos; that guy is a cut-up. After that, James Pasley, the guy with the thick Irish brogue, was not as helpful. The words from the W3 Consortium were a good distillation of the Mark Long videos, but it helped to watch the videos before reviewing the W3C's notes. The best note: "XML doesn't DO anything!"
I realize that XML gets handled in browsers and not in any dedicated XML application. I'm also thinking that last week's question about web applications might have led me to wonder how those servers query their databases, and I am now assuming that they do it with XML. I am also curious about XSL and XSLT and how they are used.
Tuesday, October 7, 2014
Unit 6 Web Servers Pt.1
I learned HTML in 1998 while attending a fly-by-night trade/night school for graphic design in San Francisco called Platt College. It was about a year and a half between me graduating and getting a job doing graphic design at Kinko's in San Francisco's Financial District, and just four months between there and getting the job at National Geographic. Back then, you wrote up HTML by hand in TextEdit and put it into BBEdit 4.0, which handled HTML 4.
In the summer of 2005, after getting laid off from NGMaps, I acquired Dreamweaver and put together a very nice online portfolio that got me agency gigs. This semester, I have been taking IRLS 475/575 with Martin Fricke, and part of the workload has been taking online HTML courses on Codecademy. Codecademy was a good refresher in nesting my tags correctly, even when making a table by hand; keeping one's tags neatly and correctly nested is really important. Before last week, I had forgotten how to make an image map and make the images clickable by hand.
Many of the tags seem to have changed since 1998, but I was game for setting up a one row, three column table by hand. This entailed typing everything out in my old friend TextEdit. This went nicely and I was able to keep my tags neatly nested by creating the beginning and end tags together and placing content within them.
When I realized I might want the text to line up with the pictures, I saw I was going to have to make the table two rows and three columns. Time being in short supply, I did this part in Dreamweaver, and maybe I should have done the whole project in Dreamweaver. I'm old; I think that there's some kind of "character" in doing stuff by hand instead of in an automated manner, but with the calendar showing "Oct. 7," I didn't have time for keeping all my tags in order.
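For the curious, a hand-typed two-row, three-column table along the lines of what I was building would look something like this; the image names and captions are placeholders I made up:

```shell
workdir=$(mktemp -d)
cd "$workdir"

# Type the opening and closing tags together, then fill in the
# content, so the nesting never gets out of order.
cat > table.html <<'EOF'
<table>
  <tr>
    <td><img src="pic1.jpg" alt="first picture"></td>
    <td><img src="pic2.jpg" alt="second picture"></td>
    <td><img src="pic3.jpg" alt="third picture"></td>
  </tr>
  <tr>
    <td>Caption one</td>
    <td>Caption two</td>
    <td>Caption three</td>
  </tr>
</table>
EOF
```

Two `<tr>` rows, three `<td>` cells each, and every tag closed; Dreamweaver generates essentially the same thing, just faster.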
While trying to configure the fixed IP today, I wasted a lot of time trying to figure out what went wrong when it dawned on me that I had typed in "netmaask" instead of "netmask." I guess haste makes waste. It's these tiny details that must be adhered to intensively in every aspect of technology. And that reminds me of my days of BBEdit when I had a whole page which would not load because I typed a comma instead of period and had to comb through all that code, in a magnified form (so I could actually read the punctuation) before I was able to figure out what the problem was.
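For anyone else hunting a "netmaask"-class typo, this is roughly the stanza I was wrestling with, written here to a scratch file rather than the real /etc/network/interfaces; the interface name and addresses are invented placeholders:

```shell
workdir=$(mktemp -d)

# Ubuntu-14.04-era static IP stanza for /etc/network/interfaces.
# A misspelled keyword like "netmaask" will not do what you want.
cat > "$workdir/interfaces" <<'EOF'
auto eth0
iface eth0 inet static
    address 192.168.1.50
    netmask 255.255.255.0
    gateway 192.168.1.1
EOF

# A quick sanity check beats an hour of squinting:
grep -n 'netmask' "$workdir/interfaces"
```

Grepping the file for the keyword you think you typed is a cheap way to catch the tiny details before they cost you an afternoon.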
The U System was actually more of a challenge than setting up my webpage. I was banging my head against a brick wall for a while before my wife pointed out that I had gotten disconnected from the VPN. I started up the Cisco client and fired up FileZilla, and we were cooking with gas!
I will be interested in knowing how HTML coding can unlock all of the back-end features of HTML5. My website seems to have some em or bold tags gone awry, but I felt that had I continued working on it, I would not have gotten it done today.
Tuesday, September 30, 2014
Unit 5: The Networked Environment
Overwhelmed?
I think we had a lot of material to cover in Unit 5. I was familiar with the concept of the OSI model of the Internet as a Seven-Layer Burrito, but there was a lot of information just to act as foundation for the things we needed to learn. I appreciated Warriors of the Net (although I still have no idea what the "Ping of Death" could be). As I have mentioned elsewhere, metaphors and analogies go a long way for me in conceptualizing processes or things like how the Internet works. I really liked the clip about Bob Metcalfe, founder of 3Com. I was really overwhelmed by many of the wiki articles. When wiki articles about technology do overwhelm me, I have been known to "translate" them into "Plain English"; a lot gets lost, but sometimes I need to know functions rather than profound technical details, and "Plain English" will get me to the kernel of the matter. The assigned reading by Nemeth about TCP/IP was completely daunting, but I understand why UDP might be preferable to TCP for things like a Skype session. The reading on name resolution on the LAN was where things really broke down for me in terms of comprehension, and then it was really rough going. I am having a difficult time discerning which parts are just background, what we will need to know for quizzes, and what will be critically important once I am looking for work. However, I think I understand that Bridged mode is a means of bypassing one's computer and just having your virtual machine access your router.
I really liked the material we had about the Plain Old Telephone System (POTS); again, it is really useful as a model for understanding and contrasting with the Internet. Plus I did some cable pulling in the early to mid '90s, so I have run (and terminated) a lot of CAT5 cable and have even seen the ancient twisted pairs. Oh, and also my dad was an engineer for Ma Bell for about three decades, so it's one of the few safe topics with which I can talk to him. Physical connections (Layer 1, the physical layer of the OSI model) of copper wires make a lot more sense to me as a physical model, and they can serve as a conceptual model for the later systems of information transfer. I guess things started to make sense for me when I realized a quarter-inch phone connector was the same one you use to plug a guitar into an amplifier. It wasn't until I had left that field that I found out TIP = "tip is positive" and RING = "ring is negative." So when it comes to plugging things together and running them, I have no problem: RCA cables, phone connector cords, heavy-duty audio cable, or USB cables.
Regarding learning style, I guess I am primarily an auditory learner (50%), followed by visual (40%), with some tactile (10%). This might help explain why I really appreciate classroom learning as well as the kinds of YouTube videos instructors post of themselves demonstrating how a task is done. I took a test to determine my learning style at educationplanner.org, but honestly, I doubt the validity and methods of any online test to determine anything of importance. However, I agree that the auditory portion is crucial for me. I have no idea if this online test even checks for social versus solitary learning, which is another aspect I think is important. I'm realizing that there are several competing schools of thought regarding learning styles, including those who find the whole notion of "learning styles" to be bunkum. The primary argument is that breaking learning down into sensory components ignores that we learn through our entire sensorium, not just seeing, hearing or touching.
When it comes to computer learning, again, I like the YouTube videos that go through each and every step. I anxiously take lots and lots of notes, frequently pausing the videos and scrubbing the playback head to the previous slide to take note of EVERY DETAIL, EVERY CHECK IN EVERY BOX and the correct answers to EVERY FIELD. Oftentimes my notes seem useless in hindsight, however.
For the readings, I make it a point to print out every reading (under 50 pages) and really, really mark them up, both with a highlighter pen and a ballpoint pen. I find this helps me retain things and distill it to manageable portions. I also print out PPT slides if they are available if it's something I'm not particularly familiar with. In fact, I have been known to take screenshots and print out the PowerPoint slides if they are not available.
Anyhow, I actually have no problems with written text. Some bitter, mean old commentators have said that the current generation of "digital natives" is allergic to the written word; I'm just thinking that to them it's not as exciting as the many newer media available to them. But I prefer print. Before I started taking classes in SIRLS, I was really getting into Gibbon's immortal and monumental Decline and Fall of the Roman Empire (despite his tendency toward overly attenuated prose), and once I am done with my MLIS, I really, really, really want to read Tolstoy's War and Peace, mostly to read the history of Napoleon's invasion of Russia and to find out what happened to the Rostov family. I have no problems with books and print and reading. I appreciate print as a medium, but also the dynamism of the Net, where reading is still the most substantial mode of learning; one could call the Net "enhanced reading," or I guess they already call it "hypertext." That said, receiving instruction in installing, configuring or adding users in Linux in print makes me want to ask questions, lots and lots of questions. Plus I get easily frustrated with technology issues and never know if my problems are unique or just something that happens when you are running Ubuntu on a Macintosh. Finally, there is the complete lack of feedback in the shell, which is terrifying, especially in a learning environment where I wish every action I took got some feedback.
Thursday, September 18, 2014
UNIT 4: A New Hope?
OMG: after 22 years of fear and loathing for the command line environment*, I think this week's assignments were actually pretty straightforward! For once all I needed to do was just follow the instructions exactly as written (this does not ensure there will be no typos, however) and go along with it. If the remaining units of this class feature so few unpleasant surprises, I can stop shaking at the mere thought of working in Linux and relax and learn! Of course the first new user I added was "newuser," but we are going straight for functionality and not aesthetics in this class. But yeah, watch out for typos, Doug.
Have I mentioned that I work well with downloading all the assignments, printing them out and keeping them on hand? Yeah, I'm like a serial killer of trees, but there's no way around it for me. I have to check things off and cross them out and write extensive notes. Do all UNIX users need a physical notepad for this stuff? I do so far. However, I can seldom make head or tail of these notes once I am done.
It was kind of cute when I logged in under newuser and attempted to use the sudo and got reminded that I wasn't logged in as dougwelch and didn't have those admin privileges. I guess that's how we learn. But it's a lot easier to learn when you can just chuckle instead of when you are hysterical and just about ready to throw things in frustration (like the last two weeks). Just because that mouse has a tether doesn't mean you can't do some damage when you throw it out of anger/sadness/frustration.
During this unit, when I was hung up, it was generally because I had miskeyed something, or because I had started typing while my last command was still processing (running a huge list of some kind), so the last thing I typed came up at the command prompt; I then typed out the new command from the instructions without noticing that there were some random letters at the beginning of the line, next to the prompt. But I was able to figure it out with less emotional attachment.
Nevertheless, the nice thing about the command line is that you can see what it was that you keyed in and say, "Gosh, maybe I shouldn't have typed 'sudo' twice?"
Oh yeah, was that grep command new? It seems like a "find"-type command. Oh, it's on our cheat sheet under "searching." Nice. I hope I can use that someday and actually get what I'm looking for.
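Since writing that, I tried it out. Here is a tiny demo (the file name and contents are made up) of grep doing its "find inside files" thing:

```shell
workdir=$(mktemp -d)
cd "$workdir"

# A throwaway file to search.
printf '%s\n' 'apples' 'Bananas' 'apple pie' > fruit.txt

grep 'apple' fruit.txt      # prints the two lines containing "apple"
grep -i 'bananas' fruit.txt # -i ignores case, so "Bananas" matches
grep -c 'apple' fruit.txt   # -c counts matching lines: prints 2
```

Once you add in regular expressions and the -r flag for searching whole directory trees, it really does become a librarian-grade finding aid.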
I am hoping that I have turned some kind of a corner here. No I don't think I could talk to Neal Stephenson about programming like I could talk to him about Mediterranean geography in the era of Louie Catorce, but I'm now feeling like I can do a couple things with Ubuntu and Ubuntu Server that I couldn't do a month ago.
My fear of Linux and UNIX style OSs took a little vacation to Bora Bora while I powered through with little difficulty. If only my fear took me along to the lagoon, I might be even happier. Well, I'm grabbing my snorkel and fins and watch out for the stonefish and gars!
*I took a class in BASIC in the summer of 1982 and decided to hold off until computers got "user friendly" enough for me to comprehend them. That would be about a decade and a half later.
Tuesday, September 16, 2014
UNIT 3 Configuring with text editors on Ubuntu Server
UNIT 3
Well, in this unit I learned about vi and nano. I prefer nano of the two, because I am not the world's most organized mind. vi's separate modalities were a source of frustration, but only almost as frustrating as the number of articles and websites saying, "Now this part of starting up Ubuntu Server will be utterly confusing and frustrating for the novice user." I would like that kind of language only if it "prepended" a good metaphor for how to remember the task in question; one featuring cute bunnies or traveling salesmen would be nice. Anyhow, I am told that vi can be referred to as nano without the training wheels.
So, there is a command mode and an insert mode in vi, and clearly I wasn't always on board as to which of those two modes I was in when I started typing text or wanted to make vi do something like insert text. But I did find out that "x" and "dw" are good friends for people who are lousy at remembering which mode they were in. Those two commands are "delete character" and "delete word" in vi ("dd" deletes a whole line), and they got rid of a lot of stray "i"s and "a"s, which came up when I was in insert mode while thinking I was still in command mode and needed to type the commands for "insert" and "append."
Oh yeah, another priceless lesson (I should have learned last unit) is adding sudo to commands that sometimes were not working otherwise.
I don't think I've ever done any kind of configuration before on this iMac. I think I set up devices on an old parallel port on the Dells we used at that mapping software company about 14 years ago, but I was given a printout of what I was supposed to do each step of the way, the hardest thing I recall being how to figure out a number or address for the scanner in question. That, of course, required knowing which addresses the devices already attached to my computer were using. Thank goodness for the rise of USB! Yes, there was a time when, each time you added a new peripheral or device to your machine, you needed a disk to install the device driver, which would explain that whole system directory called "/dev." Also note some of the crazy antiquated devices that are still supported by Linux, like SCSI drives, tape drives, floppies, et cetera.
Getting back to text editors, I remember using Notepad and TextEdit, specifically for writing HTML 4 code; then you could just drop into BBEdit, update the file and refresh the browser to make instant updates on the web pages I was doing, back before Dreamweaver made all of that seem like a waste of time. All of this took place at the late, late tail end of the 20th century. I initially did not see the value of unformatted text files and the very long lines of text they created, but I'm sure InDesign probably saves text that is being shoehorned into user-defined text boxes in a similar manner. It's just that the text-box objects are courteous enough to let you know if some of the text is outside the little window the user has created, a nicety that text editors do not extend to users and probably never will, considering that they do not serve the same purpose a text box in InDesign does.
Sunday, September 7, 2014
Unit 2
UNIT 2
Working in Ubuntu with the command line! Wow, I felt like I had a lot more control this time than at any other time! I guess the fact that I am operating in the sandbox-like environment of this "shell," as opposed to working on an actual network off of my own "box," gave me a better feeling of being grounded! Essentially, I just made a few directories using the mkdir command and then, using the cd command, I jumped from one to the next. They had simple names like "Things_I_Like," "Yukky_Things," "Iron_and_Rust" and "Severans." These would be good directories to store images or documents. When I changed directories to "Yukky_Things" I learned, the hard way, about what William Shotts had been saying regarding the case sensitivity of Linux (due to a typo, the system didn't recognize my request as a directory that existed); I also followed his advice regarding using underscores instead of spaces to separate words. But I got stuck in "Yukky_Things" because there was nothing in that directory and I wasn't sure where I was, so I used the cd command without any filepath and was taken back to the home directory, and then used the ls command to see all four of the directories I had created. I had to use my computer for something else and got a little panicked as I didn't have my arrow cursor. Sadly, I had to reboot, and the arrow came back, so I was able to hit the "pause" button in the Ubuntu interface. Still, this is the most productive session I've had with a minimum of confusion.
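For my own future reference (since I doubt I can replicate anything), the whole session boils down to a few commands. I'm replaying it here in a scratch directory instead of my real home directory:

```shell
# Replay of the session in a throwaway directory.
workdir=$(mktemp -d)
cd "$workdir"

mkdir Things_I_Like Yukky_Things Iron_and_Rust Severans

cd Yukky_Things   # case matters: "cd yukky_things" would fail
cd "$workdir"     # a bare "cd" with no path goes to $HOME instead

ls                # shows all four directories
```

The lesson that stuck: when you are lost, a bare cd takes you home, and ls tells you what is around you.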
UPDATE
UNIT 2.1
I have somehow muddled my way into X Windows!
Like everything I have done in the command environment, I will be surprised if I can replicate stuff I have figured out. That said, I was able to open Writer, Calc and Impress and create files in those applications, saved in my "Documents" folder. However, as a librarian/curator of multimedia, I made the fatal error of saving a file with the name "Untitled." If I am able to stop VMware, reopen the app and find my files in the "Documents" directory, I will have impressed myself. I was really happy to be working in a GUI environment briefly, especially with apps that seem to copy the MS Office apps and their tools and menus. But I know this will be leading to bigger, better and more complex things over the course of 672.
Tuesday, September 2, 2014
UNIT ONE
Well, that didn't take long!
Well folks, it's just the beginning of this class and I guess for me, it's kind of make or break. If I can make it through this class and even retain the slightest part of the content, I will hopefully have a clearer understanding of Linux, UNIX and the cold heartless realm of the command environment. Part of me thinks that if I could have ever gotten my brain wrapped around it, I would have done so already and having been exposed to the BASIC programming language in the summer of 1982, maybe that just might be the case. Nevertheless, this is the first time I have had formal, academic training in this powerful mode of computing since the rise of the internet.
I know a lot more about "computer concepts" than "Computer practice"
I have been enthralled by the idea of open source software since I was first exposed to it in about 1998, when there was some publicity in the financial media about Linux and Linus Torvalds himself and his radically community-inclined ideas of distributing software creation and bug fixing throughout an entire not-so-organized community. That's all good, but now I personally must roll up my sleeves and dig into the work. My first query on the Ubuntu user site was: "How can someone who came to understand working and navigating across networks in the GUI Windows Explorer era build a mental model for working in the UNIX/Linux command line universe?"
Well, this response to someone asking about "virtual terminals" seems to help me out a little. For some reason, the metaphor of teletypes, an early, text-only precursor to, say, a fax machine, is really helping me right now. Teletypes were essentially networked typewriters which could communicate over (usually dedicated) phone lines. A user at one end would enter text by typing, and the user at the other end would get a typewritten text message. That notion of computers being "glass teletypes" fits in well.
I guess my first working experience with the internet was a cruddy beige UNIX machine where everything was menu-driven and I input data, composed emails and did very crude searches on what was then called a UNIX dumb terminal.
As I mentioned above, when I worked at a more modern workplace, three years later I got a lot of help conceiving of where I was on the network with Microsoft's Explorer bar, a bar on the left of the screen that showed your computer as a terminal in the office network. Occasionally, I had to take care of work that was on a coworker's computer or utilize their software remotely from my computer. The Explorer Bar helped me keep that stuff straight. UNIX doesn't have anything like that, so I frequently can't figure out in which directory I am currently working.
Tuesday, August 26, 2014
Now I'm a full time student, work is expanding to fill my time!
Hoo boy, this semester promises to be a challenging one! Fortunately I have taken the plunge and cleared much of my schedule of such distractions as "work" and "steady income," so I should have nooooo problem getting everything done in a timely manner, right? Although I am not as sartorially elegant as the guy in the illustration, I certainly share the feeling of dread, anxiety and fear of getting crushed, but hey, at least I don't have all those heavy books to lug around! I am hoping once I get out of this class to have a lot more confidence in dealing with technology issues, problems and tech matters in general. If you are reading this blog because you are also in IRLS 672, I wish you the best of luck in doing the same!