Blog from March, 2022

Time is flying by and keeps getting away from me! There have been several changes to the program since our last NET+ LastPass update in December 2020 that I will highlight here, including updates on LastPass transitioning to an independent company, the 2021 program update, and LastPass features and functionality, as well as an answer to a frequently asked question that came up in community discussions. Even though 2021 was another very challenging year for everyone, the NET+ LastPass program has continued to grow with 51 campuses signed up and several more recently added!

LastPass transitioning to an independent company

The NET+ LastPass program was formally launched during a service evaluation in 2014 with a start-up at the time named LastPass at the helm. LastPass was acquired by LogMeIn, and at the end of 2021, LogMeIn decided to transition LastPass back into an independent company. Like the previous transitions, this one should have minimal impact on NET+ LastPass campuses. One benefit for campuses will be additional focused development resources devoted to LastPass, accelerating updates. We have been in contact with LastPass and the GoTo team to coordinate on details relating to the transition. Additional information from GoTo on the transition can be found here.

Status update on the 2021 program update

In the 2020 update, the first major update to the program, we announced that campuses would be transitioning from the legacy agreement to the updated agreement that they were involved in developing. We've now worked through most of the campus transitions to the updated agreement, with only a couple of campuses on multi-year agreements still to transition.

Update on LastPass features and functionality

LastPass has been busy developing features and functionality of interest to the higher education community:

  • There’s a new Google Workspace integration for campuses that use Google as their directory, which makes it easier for you to manage users in LastPass, and for your end users to access their LastPass accounts.
  • New dark web monitoring is available that campuses might be interested in. I keep hearing about campuses dealing with the latest compromised password lists and how they are responding. Adding dark web monitoring through NET+ LastPass could help alert people when their email addresses show up on the dark web and potentially help campus information security teams. Users can access dark web monitoring in their Security Dashboards. For campuses that don't want their end users to access it, there is a policy admins can use to turn it off. If you've used this, I'd love to hear your feedback because it could be valuable to campuses.
  • LastPass Vault Accessibility Updates: Updates to the LastPass vault completed in May 2021 have removed some of the usage barriers for users with disabilities. After a recent accessibility assessment resulting in 83% WCAG (Web Content Accessibility Guidelines) 2.1 AA compliance, LastPass' VPAT documentation is now available for customers under an NDA.
  • An extensive list of LastPass new features and future features will be announced regularly through LastPass Insider.
  • LastPass is working on an update to the Premium as a Perk aspect of the program, changing it to Families as a Benefit.
  • LastPass recently blogged about cyber liability insurance as many insurance companies are requiring password management and multifactor authentication to reduce premiums.
  • LastPass also held a webinar with our partner the REN-ISAC on Password Hygiene in Higher Education: Risks, Solutions, and Strategies that may be of interest to the community. The recording can be found here.

As mentioned in previous updates, we’re discussing a potential service evaluation for LastPass MFA and LastPass SSO. If you’re interested in that, please reach out to me.

Answering a frequently asked question

One of the most frequently asked questions in community discussion pertains to price increases. Why have there been so many price increases? The campuses asking those questions are on direct agreements for LastPass, so I’m not familiar with the details. For the NET+ LastPass program, we have had one price increase for new and existing campuses since the start of the program in 2015. The NET+ LastPass customer agreement also has a term in it that limits price increases for existing campuses to up to 5% annually. This is one of the many strong contract terms that campuses negotiated with LastPass as part of the NET+ LastPass program.


That’s a lot of updates, so thanks for reading to the end! If you have any questions, suggestions, or want to get involved in the NET+ LastPass program, please reach out to me!

Here at Internet2, we are fortunate to be working with a wonderful group of students from Notre Dame's Master of Science in Business Analytics program. The group is working to help us gain insight from detailed usage data we get from the NET+ AWS and GCP programs. Our hopes are that we will be able to use that data to observe emerging patterns of cloud infrastructure in higher education and research, and to use that knowledge to help the community support effective cloud use.

In order to provide analytic access to the data, which is kept in Google Big Query tables, we wanted to provide the students with a Jupyter notebook environment where they would not need to download or store the data on their own personal laptops while they work with us. This post documents how we are providing that environment using Managed Notebooks in GCP's Vertex AI Workbench.

We have set up a Google Group for the class project, containing the members of the class as well as the Internet2 staff working on the project with them. To allow the group to create notebooks, we added the Notebooks Admin role for the group within our GCP project. (Open question: would the Notebooks Runner role be adequate for our purposes?)
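As a sketch, the role grant can also be made from the command line; the project ID and group address below are placeholders, not our real names:

```shell
# Grant the Notebooks Admin role to the class Google Group
# (hypothetical project ID and group address -- substitute your own).
gcloud projects add-iam-policy-binding your-project-id \
  --member="group:nd-capstone-group@example.edu" \
  --role="roles/notebooks.admin"
```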

For our purposes, as we only have four students in our group, we used the GCP Console to manually create the notebooks. The process could be automated for larger or repeated use (or one could use Google's Rad Lab Data Science repo).

The process for creating Managed Notebooks is documented here:

At present Managed Notebooks are only available for a single user, so we created an individual instance for each student, naming each notebook with the student's email identifier. Each notebook is then assigned to the student's email address as its single owner (at the bottom of the Advanced Settings screen).

To help in managing costs, we reduced the size of the instances from the default n1-standard-4 to n1-standard-2, and reduced the idle timeout period from 180 minutes to 60 minutes.

The result of creating notebooks manually in the console is a running notebook process, viewable in the Vertex AI Workbench screen in the console. We then stop those processes, as we will rely on the students to start them up when they want to use a notebook.
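Stopping the runtimes can also be scripted rather than clicked through. Assuming a reasonably recent gcloud (the runtimes commands have lived under gcloud beta in older releases), something like the following, with placeholder names:

```shell
# Stop a managed notebook runtime so it no longer accrues compute charges
# (hypothetical runtime name, location, and project -- substitute your own).
gcloud notebooks runtimes stop nd-capstone-jdoe \
  --location=us-central1 \
  --project=your-project-id
```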

To give the notebooks access to our Big Query tables required assigning the BigQuery Read Session User role to our group. The group already had the BigQuery Data Viewer and BigQuery Job User roles assigned within our project.
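That grant follows the same pattern as the earlier one; again, the project ID and group address are placeholders:

```shell
# Add the BigQuery Read Session User role to the class group
# (hypothetical project ID and group address -- substitute your own).
gcloud projects add-iam-policy-binding your-project-id \
  --member="group:nd-capstone-group@example.edu" \
  --role="roles/bigquery.readSessionUser"
```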

The process for accessing Big Query data from a Jupyter notebook is documented here:

Because we are using GCP Managed Notebooks, all the necessary pieces for accessing Big Query are pre-installed (as are the usual Python data science modules), so the notebooks are ready to go once started.

We anticipate very low costs for using this service: Managed Notebooks are currently in Preview, and there is no management fee for managed notebooks while in Preview. The instance costs for the n1-standard-2 machines are $0.10 per hour. There can be costs for queries submitted to Big Query, but we anticipate that our uses will remain well within the free tier of Big Query usage.
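As a rough sketch of those instance costs (the usage figures here are hypothetical assumptions, not measured numbers):

```python
# Back-of-the-envelope compute cost estimate for the class notebooks.
hourly_rate = 0.10      # $/hour for an n1-standard-2, per the pricing above
students = 4            # assumed: one runtime per student
hours_per_month = 20    # assumed: hours of active use per student per month

monthly_compute = hourly_rate * students * hours_per_month
print(f"Estimated monthly compute cost: ${monthly_compute:.2f}")  # $8.00
```

Even doubling those assumed hours keeps the compute bill under $20 per month, which, together with the 60-minute idle timeout, is why we expect costs to stay low.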

Many thanks to Maddie Howe for helping to test and troubleshoot this process!

We sent out the following instructions to the students to let them know how to access their notebooks.

I’ve set you each up with a Jupyter environment in our GCP organization for work on the capstone project. 
To get to the environment, follow these instructions:
  1. Go to the Managed Notebooks page in the GCP console:
  2. You should see a notebook named with your email id – e.g. nd-capstone-jdoe
  3. Click the checkbox next to your notebook name and then click the Start icon on the Workbench 
    menu line at the top of the page
    (if you don’t see the Start icon, click the three dots there and you will).
    It takes 5-10 minutes to spin up the instance.
  4. Once your instance is running, click on Open Jupyterlab and you’ll get a new tab with 
    Jupyterlab – that can also take a few minutes.
  5. You can then start a new notebook.
  6. You should be able to access our Big Query tables as documented here:
A sample query to test:

%%bigquery testdf
SELECT distinct Product_Name FROM `projectname.datasetname.tablename`
order by Product_Name

That will put the result of the query in a pandas dataframe called testdf. To verify:
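To illustrate what that verification might look like, here is a stand-in with hypothetical data; inside the notebook you would run the same calls on the real testdf produced by the %%bigquery cell:

```python
import pandas as pd

# Hypothetical stand-in for the DataFrame the %%bigquery magic creates;
# the real testdf holds the distinct Product_Name values from the query.
testdf = pd.DataFrame(
    {"Product_Name": ["Amazon Web Services", "Google Cloud Platform"]}
)

print(testdf.head())   # peek at the first rows
print(len(testdf))     # number of distinct product names returned
```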

A few notes:
- When you’re done using Jupyter, please go back into the console and stop your instance.
- The instances time out after 60 minutes of no use, so it’s not the end of the world
if you don’t stop it, but it’s a good practice to get into.
- The instances are not huge – 2 CPU, 7.5 GB of RAM, no GPU, 100 GB of disk. If you need more power, please let me know.

Update: March 9, 2022

Aaron Gussman from Google sent along an example of using the notebooks API to create a managed notebook instance, which doesn't appear to be in Google's documentation anywhere yet.

Here is the API example to create a Managed Notebooks runtime with Idle Shutdown settings:




# Placeholder values -- substitute your own project, location, runtime name, and owner
BASE_ADDRESS="notebooks.googleapis.com"
PROJECT_ID="your-project-id"
LOCATION="us-central1"
RUNTIME_ID="your-runtime-name"
OWNER_EMAIL="student@example.edu"

AUTH_TOKEN=$(gcloud auth application-default print-access-token)

# Request body: a single-user runtime owned by the student, with idle shutdown enabled
RUNTIME_BODY="{
  'access_config': {
    'access_type': 'SINGLE_USER',
    'runtime_owner': '${OWNER_EMAIL}'
  },
  'software_config': {
    'idle_shutdown': true,
    'idle_shutdown_timeout': 180
  }
}"

curl -X POST https://${BASE_ADDRESS}/v1/projects/$PROJECT_ID/locations/$LOCATION/runtimes?runtime_id=${RUNTIME_ID} -d "${RUNTIME_BODY}" \
 -H "Content-Type: application/json" \
 -H "Authorization: Bearer $AUTH_TOKEN" -v